How to Improve DAM User Adoption and Maximize ROI

By Emily Quan

For many DAM managers, the phase right after the rollout of a new or improved DAM system can be nebulous. How do you measure user adoption? How do you ensure it? There's a high chance that the DAM is competing for your users' attention alongside other systems that have a bigger impact on the bottom line. Balancing competing priorities and adapting to new technologies can be confusing or simply frustrating for individuals, posing significant risks to user adoption. A comprehensive, analytics-driven view of asset use and value, user activity trends, and user experience (UX) improvement opportunities is key to the program's success. This blog outlines approaches to applying asset metadata audits and dashboard reporting to encourage user adoption and to appropriately assess and demonstrate DAM ROI.


User Adoption

Achieving high user engagement and active participation post-launch is challenging, especially if metadata quality is poor: users quickly lose confidence that the DAM system will return the search results they expect. Your user adoption strategy must be supported by effective communication and targeted messaging to internal users, external third-party users, user groups, and user levels, given their varying technical expertise and permissions. Establishing a platform and culture of continuing education and training takes time to socialize, but empowering users to properly upload, download, and share assets, and to apply metadata, is invaluable to improving user adoption.


Asset Metadata Audits

The post-launch breather may leave you with an initial appetite to view statistics and gain insights into user activity and asset value. However, gathering these insightful data can be hampered by little or no reporting data, or by data in unusable qualitative and quantitative formats. Regular system audits and usage reporting are imperative processes that aid in maintaining a robust DAM system and in evaluating ROI. Auditing sheds light on why users are unable to locate specific assets, what data (or lack thereof) is hindering findability, and why users continue to explore workarounds outside the DAM to find assets.


Audits and reports can give you insights into what is going wrong, but there are techniques you can use to understand why it is going wrong. Testing a sample set of assets by uploading test assets and performing day-to-day tasks (i.e., upload, search, and retrieval) can validate any hypotheses you identified during the audit process. This practice of further investigating audit and report observations can help you continue to improve the metadata model, understand gaps, identify duplicates, and initiate conversations with end users and the system vendor.
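The upload-then-search round trip described above can be sketched in a few lines of Python. The `FakeDAM` class below is a stand-in for your vendor's client, and its method names (`upload`, `search`, `delete`) are illustrative assumptions, not a real SDK:

```python
# Round-trip sketch: upload a known test asset, confirm a metadata
# search returns it, then clean up. FakeDAM stands in for a vendor
# client; the method names are illustrative, not a real SDK.
class FakeDAM:
    def __init__(self):
        self.assets = {}
        self._next_id = 0

    def upload(self, asset):
        self._next_id += 1
        asset_id = f"T{self._next_id}"
        self.assets[asset_id] = asset
        return asset_id

    def search(self, query):
        # Naive title match, mimicking a keyword search.
        return [{"id": aid} for aid, a in self.assets.items()
                if query.lower() in a.get("title", "").lower()]

    def delete(self, asset_id):
        self.assets.pop(asset_id, None)

def search_roundtrip_test(dam, test_asset, query):
    """Return True if a freshly uploaded asset is findable via search."""
    asset_id = dam.upload(test_asset)
    found = any(r["id"] == asset_id for r in dam.search(query))
    dam.delete(asset_id)  # remove the test asset afterwards
    return found

result = search_roundtrip_test(FakeDAM(), {"title": "Audit Test Banner"},
                               "audit test")
```

Running the same round trip against the real system, with a query your users would actually type, turns an audit hunch into a reproducible pass/fail check.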

Figure 1. Asset metadata audit cycle. Repeat actions 1-5 to enhance the DAM system.

Sample metadata audit cycle outcomes:

  • Incomplete asset metadata
  • Inconsistent values in uncontrolled text fields
  • Required metadata fields left blank
  • Overlooked relationships between metadata attributes, values, and folder automations

While pre-defined controlled vocabularies mitigate user data entry errors, errors in manual entry fields are inevitable. System bugs may also have slipped through testing, causing issues such as blank required fields. These findings typically require modifying the metadata model and training documentation to accurately reflect the desired metadata application practices and to ensure users understand how their actions contribute to the usability, or failure, of the system.
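As a minimal sketch of how such findings might be detected programmatically, the script below audits an exported list of metadata records for blank required fields and variant spellings in an uncontrolled text field. All field names (`asset_id`, `title`, `usage_rights`, `campaign`) are hypothetical; substitute your own schema:

```python
# Minimal audit sketch over exported asset metadata records.
# Field names are hypothetical; map them to your own schema.
REQUIRED_FIELDS = ["asset_id", "title", "usage_rights"]

def audit_metadata(rows):
    """Flag blank required fields and inconsistent free-text values."""
    blank_required = []
    for row in rows:
        for field in REQUIRED_FIELDS:
            if not (row.get(field) or "").strip():
                blank_required.append((row.get("asset_id"), field))

    # Detect variant spellings in an uncontrolled text field by
    # grouping values that match after case/whitespace normalization.
    spellings = {}
    for row in rows:
        raw = (row.get("campaign") or "").strip()
        if raw:
            spellings.setdefault(raw.lower(), set()).add(raw)
    value_variants = {k: sorted(v) for k, v in spellings.items()
                      if len(v) > 1}

    return {"blank_required": blank_required,
            "value_variants": value_variants}

# Example: one record missing a title, two spellings of one campaign.
rows = [
    {"asset_id": "A1", "title": "Logo", "usage_rights": "internal",
     "campaign": "Spring Launch"},
    {"asset_id": "A2", "title": "", "usage_rights": "internal",
     "campaign": "spring launch"},
]
report = audit_metadata(rows)
```

Run against a full export, the same two checks surface the incomplete-metadata and inconsistent-value findings listed above as concrete asset IDs you can remediate or discuss with users.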


Dashboard Reporting

Dashboard reporting is a valuable visual tool for communicating data-driven insights, effectively demonstrating ROI to various business groups, and measuring the DAM strategy against program and organizational goals. The quality and user interface of your DAM system's reports depend on the vendor's system intelligence and reporting capabilities. Many out-of-the-box (OOTB) DAM systems provide pre-built reports that may not address your reporting needs, and a customized report or dashboard solution may require leadership buy-in.

Reporting considerations:

  • Identify the options and the resources required. What reporting solutions does the vendor offer? Is there staff to support this effort? 
  • Document and prioritize your goals. What do you want to know about the users and about the assets? This will provide you with well-defined metrics and Key Performance Indicators (KPIs) with which to measure ROI.
  • Leverage the collective intelligence in the organization to gain an understanding of how other systems are reporting data. What reporting tools does your organization offer?

Figure 2. The reporting process to maximize DAM ROI. Repeat actions 1-6.

The outputs of these considerations will be metrics, KPIs, and success criteria that provide insight into user adoption and into the assets that deliver the most user engagement and value.
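To illustrate turning raw usage logs into KPIs, the sketch below computes a few adoption metrics (active users, zero-result search rate, download count) from a hypothetical event log. The event fields are assumptions, not any vendor's export format:

```python
def kpis(events):
    """Compute a few illustrative adoption KPIs from an event log."""
    active_users = len({e["user"] for e in events})
    searches = [e for e in events if e["action"] == "search"]
    zero_hit = sum(1 for e in searches if e["results"] == 0)
    downloads = sum(1 for e in events if e["action"] == "download")
    return {
        "active_users": active_users,
        # Share of searches returning nothing: a findability red flag.
        "zero_result_search_rate": (zero_hit / len(searches)
                                    if searches else 0.0),
        "downloads": downloads,
    }

# Hypothetical event log; field names are assumptions.
events = [
    {"user": "ana", "action": "search", "results": 12},
    {"user": "ana", "action": "download", "results": None},
    {"user": "raj", "action": "search", "results": 0},
]
metrics = kpis(events)
```

Even a handful of metrics like these, tracked over time on a dashboard, lets you show leadership whether adoption is trending up and where findability is falling short.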



Given proper management, user engagement, recurring system assessments, and asset evaluation, a DAM system affords a source of truth for your organization's assets. The more you educate, listen to, and learn from your DAM users, the more you will improve system user experience and usability, empower your users, and maximize ROI. Proactively pursuing the tactical steps presented in this blog is the first step toward addressing the challenges of rolling out yet another information system.



Emily Quan is an Associate in Optimity’s Information Management practice. She specializes in digital asset management (DAM) strategy and implementation, developing enterprise content management solutions and business process improvements. Emily has positioned her clients with effective digital content strategies, increased operational efficiencies and improved user experience.
