Digital Asset Management System – A must-have for all businesses

What are digital assets?

Wikipedia definition: “A digital asset is anything that exists in a digital format and comes with the right to use”. For example – video, music, documents, images, presentations, digital tokens (including crypto), data, or anything an organization or individual owns or has the right to use.

As digital transformation advances, businesses are increasingly dependent on digital assets. Today, even existing physical assets such as documents and prints are actively being digitized. Digital assets are convenient: they occupy less physical space, are easy to retrieve, and can be transported and transferred easily.

Businesses that have already made the shift to digital assets include:

  • Legal / Law firms
  • Advertising Agencies, Media houses
  • Broadcasting
  • HR and Recruitment firms
  • Movie Production houses
  • OTT platforms

Major industries such as retail, manufacturing, import-export houses, insurance, finance, and logistics companies are all in various stages of digital transformation.

This convenience, however, brings its own set of problems – in this case, managing the digital assets we create and those we convert from existing ones. This is especially true for business services companies that create, use, and distribute many different types of documents and related content.

How it starts

Every individual and organization starts by organizing files and assets in a traditional hierarchical system on local computers, USB storage devices, and, of late, on the cloud (Google Drive, email, Dropbox, etc.). Once there is a need to share these assets and collaborate on them, they resort to shared drives and transferring the assets via email and similar channels.

While this kind of organization works on a small scale, the system is easily overwhelmed as the number of users and assets grows.

Eventually, the challenges present themselves:

  • A single paradigm for classifying assets – different users and functional units classify assets differently. For example, the sales department may want contracts classified by customer or geography, while the accounts team may want them classified by chronology, billing, or risk. In short, one size does not fit all.
  • Sharing assets with others – providing access to “other teams” or third parties is simple at first and can be monitored. However, over time, as the content and the teams involved grow, it can spiral into complete chaos. The ideal would be to provide access to specific assets, possibly for a finite amount of time. This brings us to the next point.
  • Security of assets – in 2015, the first four episodes of a Game of Thrones season surfaced online before they even aired because media outlets had been given the episodes as part of the review process. This was catastrophic. Sensitive content, especially content of monetary value, needs to be secured, and there should be an audit trail to trace any leaks.
  • Version control – while presentation.ppt, presentation1.ppt, and presentation-ver2.ppt may work for an individual or a small team, this approach requires additional tracking effort or, worse, causes confusion at the wrong moment.
  • Automation – digital assets typically go through standard workflows including (but not limited to) publishing to websites, pushing to third parties, watermarking, QA/QC, and approvals, all of which can potentially be automated for better efficiency (a minimal sketch of this kind of automation follows this list).

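To make the automation point concrete, below is a minimal sketch of an ingest step that resizes an image and stamps a watermark on it. It assumes the Pillow library; the file paths, target width, and quality setting are placeholders, not a recommendation from any particular DAM.

```python
# Minimal sketch of the kind of asset automation a DAM can run on ingest:
# resize an image to a web-friendly width and stamp a watermark on it.
# Assumes Pillow (pip install Pillow); paths and sizes are placeholders.
from PIL import Image

def resize_and_watermark(src_path: str, watermark_path: str, dst_path: str,
                         max_width: int = 1280) -> None:
    asset = Image.open(src_path).convert("RGBA")

    # Resize proportionally if the asset is wider than the target width.
    if asset.width > max_width:
        ratio = max_width / asset.width
        asset = asset.resize((max_width, int(asset.height * ratio)))

    # Paste the watermark in the bottom-right corner, using its alpha channel as mask.
    watermark = Image.open(watermark_path).convert("RGBA")
    position = (asset.width - watermark.width - 10, asset.height - watermark.height - 10)
    asset.paste(watermark, position, watermark)

    asset.convert("RGB").save(dst_path, "JPEG", quality=85)

# Example (hypothetical paths):
# resize_and_watermark("raw/banner.png", "brand/logo.png", "web/banner.jpg")
```
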
Enforcement is a key challenge in any discipline-based system, and things quickly get cumbersome. Several sophisticated DAMs are available in the market, and when the time comes, it is best to get one in place.

When is the right time to consider a DAM?

Adopting the right technology at the right time is significant for the growth of any business. Here are some questions that will help you identify whether it is the right time to adopt a DAM in your business:

  1. Are digital assets a significant part of your business?
  2. Does your workforce spend a lot of time looking for files?
  3. Have you had to redo work from scratch when it could have been repurposed from an existing asset?
  4. Are you making duplicate purchases of assets because existing assets cannot be found?
  5. Are unapproved files being used fairly regularly?
  6. Are you losing time validating the “Final version” against the other versions?
  7. Are you spending a significant amount of time on tasks that can be automated, such as watermarking, resizing, transcoding, etc.?
  8. Does sharing large files require a process that is not as easy as sending an email?
  9. Are you finding it difficult to identify a secure store for your assets?
If you have 3 or fewer “yes” answers: you still have some time. Keep a sharp lookout for the most common cases mentioned above.
If you have 4–6 “yes” answers: it is time to start looking for a DAM and to get familiar with Digital Asset Management systems.
If you have more than 6 “yes” answers: now is a good time to get your DAM in place.

The losses and risks associated with the lack of a Digital Asset Management system are being felt by businesses around the world. The cost in lost assets and lost efficiency is real, and it has a direct impact on your business.

Hence, be proactive rather than reactive. Also keep in mind that once you have identified the DAM and the vendor, you will still need time (you are the best judge of how much) for deployment, migration, and user acceptance. Plan it well to make this initiative successful.

Find the right DAM

Once the decision is made to go in for a Digital Asset Management system, there are several choices to be made. Broadly, they are based on capabilities/features and the cost model.

Features and capability

Consider the following features:

  • The types of assets you will store in the DAM, e.g. audio, documents, images, etc.
  • The attributes used for indexing, search, and retrieval, e.g. content keywords, approval status, date, value, vendor, etc. (a small illustrative sketch follows this list)
  • AI-based DAMs can automatically tag assets for indexing – the contents of scanned documents, image contents, and video and audio keywords – which makes content ingestion a much simpler step
  • Any automated processes you would like to run on the assets – watermarking, transcoding, resizing
  • Federated authentication – consider a DAM that can integrate with your existing authentication system, so that your existing admin processes take care of access management and users do not have to remember another set of credentials
  • Sharing and permissions – the access various users have to assets or groups of assets
  • Compatibility with your existing platform and software
  • Any APIs that need to be integrated with the DAM

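To make the indexing attributes tangible, here is a small, purely illustrative sketch of what an asset record and a simple attribute filter could look like. The field names and the search function are assumptions for illustration, not the schema or API of any specific DAM.

```python
# Illustrative sketch of per-asset metadata a DAM might index for search.
# Field names are hypothetical; real products define their own schemas.
from datetime import date

assets = [
    {"id": "A-1001", "type": "image", "keywords": ["campaign", "summer"],
     "approval_status": "approved", "created": date(2021, 5, 3), "vendor": "StudioX"},
    {"id": "A-1002", "type": "document", "keywords": ["contract", "apac"],
     "approval_status": "draft", "created": date(2021, 6, 18), "vendor": "LegalCo"},
]

def search(items, **filters):
    """Return assets whose attributes match every given filter value."""
    return [a for a in items if all(a.get(k) == v for k, v in filters.items())]

# Example: only approved image assets
print(search(assets, type="image", approval_status="approved"))
```
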
Buy vs Hire

There are many solutions that can be bought off the shelf, configured, and deployed onto the cloud or local infrastructure based on your requirements. If you already have IT infrastructure and personnel, then this is probably a good approach.

OR

Several DAM solution companies offer a SaaS model where you can just pay a monthly fee and everything is handled. This is typically a good option if you don’t want the upfront expenses or don’t have a dedicated infrastructure team.

Migrate to a Digital Asset Management System

By now you should have zeroed in on a Digital Asset Management system, if not already purchased or subscribed to one. As you migrate:

  • Make sure the use cases of all the teams involved are handled, all integrations are in place, and all automated processes work with their respective asset types.
  • Ensure you have buy-in from all the stakeholders involved and set a date for the move.
  • Create the required structure and the attribute lists.
  • Ensure all potential users get their credentials on the new system.
  • Provide training to all the personnel who will access the DAM.
  • Move/import all existing assets to the DAM and ensure all new assets are added to the new system.
  • Decommission the old system. This is a very important step, as “old habits die hard” and familiarity makes users drift back to the older system.

Some popular DAMs

Here are some popular DAMs as per industry leadership sites. Most of these are SaaS-based, pay-as-you-go models and can be a good starting point.

  • Bynder 
  • Canto
  • Digizuite
  • Image Relay
  • Northplains
  • Widen Collective

For the more adventurous ones who already have IT infrastructure and a team that can manage the system, here are some open source options:

  • Islandora 
  • Phraseanet
  • Pimcore
  • Daminion Standalone Basic – the basic standalone version is free; they also offer a managed service as a paid model.

A good approach here is to involve your technical team to check compatibility with your technical skills and to evaluate the features and their maturity. Even better, deploy a working copy and test all the use cases required by all the teams. Most open-source projects come with APIs and defined frameworks to extend their functionality.

Confused? 

Get in touch with us for a quick, free assessment of your requirements and a suggestion for a suitable solution.

The Best Test Data Management Practices in an Increasingly Digital World

A quick scan of the application landscape shows that customers are more empowered, digitally savvy, and eager to have superior experiences faster. To achieve and maintain leadership in this landscape, organizations need to update applications constantly and at speed. This is why dependency on agile, DevOps, and CI/CD technologies has increased tremendously, further translating to an exponential increase in the adoption of test data management initiatives. CI/CD pipelines benefit from the fact that any new code that is developed is automatically integrated into the main application and tested continuously. Automated tests are critical to success, and agility is lost when test data delivery does not match code development and integration velocity.

Why Test Data Management?

Industry data shows that up to 60% of development and testing time is consumed by data-related activities, with a significant portion dedicated to test data management. This is consistent with the expectation that the global test data management market will grow at a CAGR of 11.5% over the forecast period 2020-2025, according to the ResearchAndMarkets TDM report.

Best Practices for Test Data Management

Any organization focusing on making its test data management discipline stronger and capable of supporting the new age digital delivery landscape needs to focus on the following three cornerstones.

Applicability:
The principle of shift left mandates that each phase in the SDLC has a tight feedback loop that ensures defects don’t move down the development/deployment pipeline, making errors less costly to detect and rectify. Its success hinges to a large extent on closely mapping test data to the production environment. Replicating or cloning production data is manually intensive, and as the World Quality Report 2020-21 shows, 79% of respondents create test data manually with each run. Scripts and automation tools can take on most of the heavy lifting and bring this down considerably when done well. With production-quality data being very close to reality, defect leakage is reduced vastly, ultimately translating to a significant reduction in defect triage cost at later stages of development and deployment.

However, using production-quality data at all times may not be possible, especially for applications that are only a prototype or are built from scratch. Additionally, using a complete copy of the production database is time- and effort-intensive – instead, it is worthwhile to identify relevant subsets for testing. A strategy that brings together the right mix of production-quality data and synthetic data closely aligned to production data models is the best bet. While production data maps to narrower testing outcomes in realistic environments, synthetic data is much broader and enables you to simulate environments beyond the ambit of production data. Using test data automation platforms that allocate apt dataset combinations for tests can bring further stability to testing.
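
As a rough illustration of mixing a production subset with synthetic records, the sketch below samples a subset and generates additional edge-case rows with the Faker library. The column names, sample size, and edge values are assumptions made purely for illustration.

```python
# Rough sketch: combine a sampled subset of production-like records with synthetic
# records so tests can also cover cases the production subset does not contain.
# Assumes the Faker package (pip install Faker); column names are illustrative.
import random
from faker import Faker

fake = Faker()

production_rows = [  # stand-in for rows pulled from a production extract
    {"customer_id": i, "email": f"user{i}@example.com", "balance": 100.0 * i}
    for i in range(1, 101)
]

# 1. Take a relevant subset instead of cloning the whole table.
subset = random.sample(production_rows, k=20)

# 2. Generate synthetic rows for scenarios the subset may not cover (e.g. edge balances).
synthetic = [
    {"customer_id": 10_000 + i, "email": fake.email(),
     "balance": random.choice([0.0, -50.0, 1e9])}
    for i in range(10)
]

test_dataset = subset + synthetic
print(len(test_dataset), "rows ready for the test run")
```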

Tight coupling with production data is also complicated by a host of data privacy laws like GDPR, CCPA, CPPA, etc., that mandate protecting customer-sensitive information. Anonymizing or obfuscating data to remove sensitive information is a common approach to this issue. Non-production environments are usually less secure, so data masking to protect PII becomes paramount.
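
Below is a minimal sketch of deterministic masking of PII fields before rows are copied into a non-production environment. The field names and the salted-hash approach are illustrative assumptions; a real programme would rely on a vetted masking or tokenization tool with proper key management.

```python
# Minimal sketch of deterministic pseudonymization of PII columns before rows are
# copied into a less secure non-production environment. Field names are illustrative;
# a real programme would use a vetted masking/tokenization tool and key management.
import hashlib

SALT = "rotate-and-store-this-secret-outside-source-control"

def mask(value: str) -> str:
    """Hash a sensitive value so joins still work but the original is not exposed."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:16]

row = {"customer_id": 42, "name": "Jane Smith", "email": "jane@example.com", "balance": 120.5}
masked_row = {**row, "name": mask(row["name"]), "email": mask(row["email"])}
print(masked_row)
```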

Accuracy:
Accuracy is critical in today’s digital-transformation-led SDLC, where app updates are launched to market faster and need to be as error-free as possible – a nearly impossible feat without accurate test data. The technology landscape is also more complex and integrated than ever before, which carries through to the complexity of data model relationships and the environments in which they are used. The need is to maintain a single source of data truth. Many organizations create a gold master for data and then make data subsets based on the needs of the application. Adopting tools that validate and update data automatically during each test run further ensures the accuracy of the master data.
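
As a simple illustration of that validation step, the sketch below checks a subset drawn from a gold master against a few basic constraints before a test run. The field names and rules are assumptions, not the checks of any specific TDM tool.

```python
# Simple sketch of a pre-test validation pass over a subset drawn from the gold
# master: fail fast if basic constraints are violated. Field names and rules are
# illustrative assumptions.
def validate_subset(rows):
    errors = []
    seen_ids = set()
    for i, row in enumerate(rows):
        cid = row.get("customer_id")
        if cid is None:
            errors.append(f"row {i}: missing customer_id")
        elif cid in seen_ids:
            errors.append(f"row {i}: duplicate customer_id {cid}")
        else:
            seen_ids.add(cid)
        if "@" not in str(row.get("email", "")):
            errors.append(f"row {i}: invalid email")
    return errors

sample = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 1, "email": "not-an-email"},
]
print(validate_subset(sample))  # flags the duplicate id and the invalid email
```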

Accuracy also entails ensuring the relevance of data in the context of the application being tested. Decade-old data formats might be applicable for an insurance application that needs historic policy data formats, but demographic data or data on customer purchasing behavior in a retail application context is highly dynamic. A centralized data governance structure addresses this, at times sunsetting data that has served its purpose and preventing unintended usage. This also reduces the cost of maintaining and archiving large amounts of test data.

Also important is a proper data governance mechanism that provides the right provisioning capability and ownership driven at a central level, thereby helping teams use a single data truth for testing. Adopting similar provisioning techniques can further remove any cross-team constraints and ensure accurate data is available on demand.

Availability:
The rapid adoption of digital platforms and the movement of applications into cloud environments have been driving exponential growth in user-generated data and cloud data traffic. The pandemic has accelerated this trend by moving the majority of application usage online. The ResearchAndMarkets report states that for every terabyte of data growth in production, ten terabytes are used for development, testing, and other non-production use cases, thereby driving up costs. Given this magnitude of test data usage, it is essential to align data availability with the application’s release schedules so that testers don’t spend a lot of time tweaking data for every code release.

The other crucial element in ensuring data availability is managing version control of the data, which helps overcome the confusion caused by conflicting, multiply versioned local databases and datasets. A centrally managed test data team will help ensure a single data truth and provide subsets of data as applicable to various subsystems or based on the needs of the application under test. The central data repository also needs to be an ever-changing, learning one, since the APIs and interfaces of the application keep evolving, driving the need to update test data consistently. After every test, the quality of the data can be evaluated and the central repository updated, making it more accurate. This further drives reusability of data across a plethora of similar test scenarios.
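
To illustrate lightweight version control of test data, the sketch below fingerprints each dataset file and records the version in a small registry keyed by release. The file paths and registry format are assumptions chosen only for illustration.

```python
# Illustrative sketch of lightweight dataset versioning: fingerprint each test data
# file and record the version in a small registry so teams can reference one agreed
# "data truth" per release. Paths and the registry format are assumptions.
import datetime
import hashlib
import json
import pathlib

REGISTRY = pathlib.Path("testdata_registry.json")

def register_dataset(path: str, release: str) -> str:
    digest = hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()[:12]
    registry = json.loads(REGISTRY.read_text()) if REGISTRY.exists() else {}
    registry[f"{release}:{pathlib.Path(path).name}"] = {
        "sha256_12": digest,
        "registered_at": datetime.datetime.utcnow().isoformat(timespec="seconds"),
    }
    REGISTRY.write_text(json.dumps(registry, indent=2))
    return digest

# Example (hypothetical path): register_dataset("datasets/customers_subset.csv", release="2021.09")
```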

The importance of choosing the right test data management tools

In DevOps and CI/CD environments, accurate test data delivered at high velocity is an additional critical dimension in ensuring continuous integration and deployment. Choosing the right test data management framework and tool suite helps automate the stages involved in making data test-ready: data generation, masking, scripting, provisioning, and cloning. The World Quality Report 2020-21 indicates that the adoption of cloud and tool stacks for TDM has increased, but more maturity is needed to use them effectively.

In summary, for test data management, as with many other disciplines, there is no one-size-fits-all approach. An optimum mix of production-mapped data and synthetic data, created and housed in a centrally managed repository, is an excellent way to go. However, this approach, particularly the synthetic data generation part, comes with its own set of challenges, including the need for strong domain and database expertise. Organizations have also been taking TDM to the next level by deploying AI and ML techniques that scan the data sets in the central repository and suggest the most suitable data sets for a particular application under test.

Need help? Partner with experts from Trigent to get a customized test data management solution and be a leader in the new-age digital delivery landscape.