Two Compelling Reasons for Application Modernization

Modernizing existing applications is about embracing change, and it is only human to resist re-engineering something that already works! However, the world today is about digitization, cloud transformation, robotics, and analytics. In this brave new world, being stuck with legacy applications can be fatal for businesses. Even though the reasons for change are so explicitly clear, many organizations find it extremely difficult to migrate their legacy applications. In most cases the reluctance arises from the fear that existing systems and architecture may make migration complex and affect existing processes. In some cases the need to change is not clearly visible, and modernization’s advantages do not make a strong enough case for change. In a few cases it could simply be inertia.

The fear of change or modernization is the primary reason why many organizations are stuck with legacy systems. However, shifting economic and competitive landscapes demand that organizations continually reinvent their information environment. This is required to achieve scalability and cost savings – both vital for survival.

The advantages of modern architecture, such as scalability, non-disruptive services and upgrades, and easy deployment, provide companies with the ammunition to move forward. While CIOs may shudder at the thought of mass changes to their IT structure, the opportunities out there are immense. For example, cloud has reached a level of maturity and reliability that makes it a viable substitute for traditional IT infrastructure. Users demand more, and their voices are growing louder. Enterprise technology is cheaper and as good as, if not better than, boxed software.

Two compelling reasons for migration:

User Expectations

Customers are growing accustomed to integrated experiences, and that is a fact. Whether doing their banking transactions, booking airline tickets, or ordering food, they expect their applications to exceed expectations. Uber, Airbnb, and Netflix are the brainchildren of these expectations. They are giving customers what they want, and that is helping their revenues add up better and faster. According to a Forrester report, organizations modernize for a variety of reasons, but the common themes are faster response to rapidly changing markets, empowered customers, and nimble competitors. (“Application Modernization, Service by Microservice” – Forrester, 11/20/2015)

Cost-Effective Options

Why pay more when you can get it for less? It makes absolute logical sense, which is why the cloud is enveloping the infrastructure space. Rigid, unwieldy networks are being replaced by secure, always-available cloud infrastructure, and the multiple options available make it even more viable.

Open source technologies are great enablers of change. They are paving the way for businesses to keep ahead of technology innovation at an unimaginable pace.

We manage our clients’ mission-critical applications and help them focus on their strategic business goals.

For example, a real estate company had a Windows-only C++ thick-client application. Integration with other applications was slow and difficult. Modernization resulted in a cloud-based web application that simplified usability, resulting in an increased volume of traffic to the site. The slick platform is scalable and user-friendly. For the real estate company, the platform is a business enabler.

The central factor behind application modernization lies in retaining the platform’s ethos while adding features that enhance its power to perform better. When organizations stop thinking of IT as separate from the business and instead see it as a business enabler, the two clinchers given above will make absolute sense. As Gartner advises, “Application modernization efforts require an organizational-level strategy and plan that involves all aspects of the IT organization.”

SQS Messaging Service in AWS

AWS SQS (Simple Queue Service), as the name indicates, is a fully managed message queuing service that receives and sends messages between software systems; it supports both standard and FIFO (First In, First Out) queues and is generally used in distributed computing. AWS SQS is a secure, durable, scalable, and reliable service. AWS provides SDKs in various languages to access SQS.

In this blog I will use the AWS SDK for PHP to send, receive, and delete messages from SQS.

Given below are the steps to be followed:

Log in to your AWS account and install/configure the AWS SDK on a local system. I am assuming that an AWS account is already available.

AWS accepts only HTTPS requests, so install a dummy SSL certificate in the local WAMP/XAMPP setup.

After installation of AWS SDK and SSL, SQS needs to be configured.

  1. Go to the AWS console, choose the preferred availability zone, and select ‘Simple Queue Service’. Click on ‘Create New Queue’.
  2. Enter the name of the queue, for example, php-demo-queue
  3. Set Default Visibility Timeout to 5 minutes. This option makes a message invisible for 5 minutes once it is picked up for processing from the queue. The maximum time is 12 hours.
  4. Set Message Retention Period to 14 days. This option keeps messages available in the queue for a maximum of two weeks if they are not deleted.
  5. Set Maximum Message Size to 256 KB. A message should not exceed 256 KB.
  6. Set Delivery Delay to 0. This will tell the queue to show the message as soon as it comes to the queue. If one does not want the message to be visible instantly, then give Delivery Delay up to 15 minutes.
  7. Set Receive Message Wait Time to 0 and click on Create Queue.

Note 1: Once the queue is created, it is possible to edit these attribute options from the console rather than via API calls.

Note 2: To create a FIFO queue, the queue name has to be suffixed with .fifo, for example, php-demo-queue.fifo
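The console steps above can also be performed through the SDK. The sketch below is a minimal illustration, assuming the AWS SDK for PHP is installed and autoloaded; the queue name, region, and helper function names are my own, and the attribute values simply mirror steps 3 to 7 (SQS expects them as strings, in seconds or bytes).

```php
<?php
// Sketch: create the queue from the console steps above via the SDK instead.
// Assumes the AWS SDK for PHP is autoloaded; names and region are illustrative.

use Aws\Sqs\SqsClient;

// Attribute values mirroring the console steps.
function queueAttributes(): array
{
    return [
        'VisibilityTimeout'             => (string)(5 * 60),         // step 3: 5 minutes
        'MessageRetentionPeriod'        => (string)(14 * 24 * 3600), // step 4: 14 days
        'MaximumMessageSize'            => (string)(256 * 1024),     // step 5: 256 KB
        'DelaySeconds'                  => '0',                      // step 6
        'ReceiveMessageWaitTimeSeconds' => '0',                      // step 7
    ];
}

// Create the queue and return its URL. Call this once credentials are configured.
function createDemoQueue(): string
{
    $client = new SqsClient(['region' => 'ap-south-1', 'version' => 'latest']);
    $result = $client->createQueue([
        'QueueName'  => 'php-demo-queue',
        'Attributes' => queueAttributes(),
    ]);
    return $result->get('QueueUrl');
}
```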

Transform your applications to the cloud

After the queue is created, it is time to add/edit permissions.

1) Select the queue name to add permissions.

2) Click Permissions tab and click on Add a Permission button.

3) Set Effect to ‘Allow’, Principal to ‘Everybody’, and Actions to ‘All SQS Actions’, then click on Save Changes.
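For reference, the console permission above (Effect: Allow, Principal: Everybody, Actions: All SQS Actions) corresponds roughly to an SQS access policy like the following; the account ID and queue ARN are illustrative, and such an open policy should only be used for experimentation:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": "SQS:*",
      "Resource": "arn:aws:sqs:ap-south-1:123456789012:php-demo-queue"
    }
  ]
}
```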

List of Methods available in AWS SQS

  1. changeMessageVisibility()
  2. changeMessageVisibilityBatch()
  3. createQueue()
  4. deleteMessage()
  5. deleteMessageBatch()
  6. deleteQueue()
  7. getQueueAttributes()
  8. getQueueUrl()
  9. listDeadLetterSourceQueues()
  10. listQueues()
  11. purgeQueue()
  12. receiveMessage()
  13. removePermission()
  14. sendMessage()
  15. sendMessageBatch()
  16. setQueueAttributes()

Example 1: Get the URL of the queue, get queue attributes, send a message, receive it, and delete it from the queue. This example models a film ticket booking system: the user provides personal details and seat selection; the main server receives and stores this information and processes the payment. The booking is then sent via the queue to a dedicated server that generates a QR code, messages it to the user, and updates the database.

<?php
require 'vendor/autoload.php';

use Aws\Sqs\SqsClient;
use Aws\Exception\AwsException;

$config = [
    'region'      => 'ap-south-1',
    'version'     => 'latest',
    'credentials' => [
        'key'    => AWS_ACCESS_KEY_ID,
        'secret' => AWS_SECRET_ACCESS_KEY,
    ],
];

try {
    $sqsClient = new SqsClient($config);

    // Get the queue URL and its attributes.
    $stdUrl = $sqsClient->getQueueUrl(['QueueName' => 'test-std-queue']);
    $queueUrl = $stdUrl->get('QueueUrl');
    $queueAttributes = $sqsClient->getQueueAttributes([
        'QueueUrl'       => $queueUrl,
        'AttributeNames' => ['All'],
    ]);
    $attributes = $queueAttributes->get('Attributes');

    // Send a booking message to the queue.
    $message = [
        'id'           => uniqid(),
        'cust_name'    => 'Demo User',
        'cust_email'   => '',
        'cust_phone'   => '987654321',
        'cust_seating' => ['A1', 'A2', 'A3'],
        'theatre_id'   => 500,
        'amount_paid'  => 1000,
        'discount'     => 100,
    ];
    $messageResult = $sqsClient->sendMessage([
        'QueueUrl'    => $queueUrl,
        'MessageBody' => json_encode($message),
    ]);

    // Receive the message, then delete it using its receipt handle.
    $receiveMessages = $sqsClient->receiveMessage([
        'QueueUrl'       => $queueUrl,
        'AttributeNames' => ['All'],
    ]);
    $msg = $receiveMessages->get('Messages');
    $receiptHandle = $msg[0]['ReceiptHandle'];
    $sqsClient->deleteMessage([
        'QueueUrl'      => $queueUrl,
        'ReceiptHandle' => $receiptHandle,
    ]);
} catch (AwsException $ex) {
    echo $ex->getMessage();
} catch (Exception $ex) {
    echo $ex->getMessage();
}
Note 1: All methods return appropriate responses and status codes. These responses can be stored for future investigation.

Note 2: The body of the message can be in any format, for example, JSON, XML, plain text, or paths to files or images, as long as it does not exceed 256 KB.

Dead Letter Queue: If a message is received X number of times without being processed successfully, it is moved to a dead letter queue. For example, if a message has been received by the server over 50 times and has still not been processed successfully, it is considered failed and sent to the dead letter queue.

To configure a dead letter queue, we need to create another queue as mentioned in the steps above, for example, test-std-queue-dlq.

Then add this queue in the Dead Letter Queue settings of test-std-queue.
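The same configuration can be applied through the SDK by setting a RedrivePolicy attribute on the source queue. This is a minimal sketch, assuming the AWS SDK for PHP is autoloaded; the queue names, region, receive count of 50, and the helper functions buildRedrivePolicy()/attachDeadLetterQueue() are illustrative, not part of the SDK.

```php
<?php
// Sketch: point the main queue at the DLQ created above via a RedrivePolicy.
// Assumes the AWS SDK for PHP is autoloaded; names and values are illustrative.

use Aws\Sqs\SqsClient;

// Build the RedrivePolicy JSON: messages received more than $maxReceiveCount
// times without being deleted are moved to the dead letter queue.
function buildRedrivePolicy(string $dlqArn, int $maxReceiveCount): string
{
    return json_encode([
        'deadLetterTargetArn' => $dlqArn,
        'maxReceiveCount'     => $maxReceiveCount,
    ]);
}

// Call once credentials are configured and both queues exist.
function attachDeadLetterQueue(): void
{
    $client = new SqsClient(['region' => 'ap-south-1', 'version' => 'latest']);

    // Look up the ARN of the DLQ and the URL of the main queue.
    $dlqUrl = $client->getQueueUrl(['QueueName' => 'test-std-queue-dlq'])->get('QueueUrl');
    $dlqArn = $client->getQueueAttributes([
        'QueueUrl'       => $dlqUrl,
        'AttributeNames' => ['QueueArn'],
    ])->get('Attributes')['QueueArn'];

    $mainUrl = $client->getQueueUrl(['QueueName' => 'test-std-queue'])->get('QueueUrl');
    $client->setQueueAttributes([
        'QueueUrl'   => $mainUrl,
        'Attributes' => ['RedrivePolicy' => buildRedrivePolicy($dlqArn, 50)],
    ]);
}
```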

Can Small Businesses Benefit from Big Data?

All organizations irrespective of their size generate volumes of data. However, for SMBs, the question is, does the cost and effort justify the value to be derived from data? Data analytics provides deep insights that complement human judgement. Forrester describes the power of big data for small business as “A major disruption in the business intelligence and data management landscape.”

There are several success stories of small businesses benefiting from data analytics. One interesting story is of a zoo in Washington State that was unable to plan daily staffing commensurate with attendance. The zoo’s major source of income was through attendance, which was highly dependent on weather.

By parsing historical data, and analyzing it against decades of local weather data, they found some predictable intelligence. This helped them to fine-tune their plans regarding staffing and promotional activities.

The fear of big data is probably related to the word ‘big’, and small companies wonder if they have enough data to qualify for big data. It does not matter. Any data, including visitor logs from a website, is enough to provide vital information on customer behavior.

Another reason why SMBs shy away from big data could be the lack of streamlined processes and information silos. A lease-management company in North Carolina that manages nearly 1,000 rental properties in the Outer Banks was unable to accurately predict profitability for homeowners through tourist rentals. With data stored in spreadsheets, the management found it impossible to analyze the data they had amassed over the years. The company opted for a business analytics tool, which distilled the data and simplified the available information.

Based on the analytics, the company could share vital information with its guests. They could now make rental-pricing recommendations to owners based on seasonal trends and so forth. The business has grown by over 10 percent and costs reduced by 15 percent in the last three years. Big data analytics for small business also helped this company to identify invoice-processing errors, and overall it saved $50,000, annually.

Leverage our Big Data Services to get insights from your Structured and Unstructured Data Repositories

Smaller organizations focused on business needs may not have the time for, or even see the need for, streamlined processes. Big data allows us to think about the current strategy, economic environment, and competitive landscape. To move from small to medium and from there to large requires processes. Incorporating them now can help to mine data, which will be useful in both the short and long term.

To summarize, big data helps small organizations to watch and learn about their customers and their preferences. Even if it is just from their website, it is still intelligence. For retaining customers and acquiring new ones, for upselling and cross-selling, and for streamlining processes that lead to operational efficiency, big data has a host of benefits that simply cannot be ignored!


How Cloud Computing is Impacting Healthcare Services

The healthcare segment is witnessing healthy growth fueled by an aging population and consumers’ increased focus on wellness. Current estimates project an 11-17% increase in demand for healthcare resources between 2014 and 2025. Healthcare organizations are meeting this demand by focusing on IT-enabled patient care. Cloud-based healthcare services seem to be in a position of strength to meet and exceed these demands, with several organizations migrating their existing applications to the cloud.

While there are several advantages of healthcare cloud services, the specific trends favoring the cloud are:

  1. Cloud-based IT solutions for value-based consumerism which rewards healthcare professionals and organizations on the basis of outcome and cost-effectiveness.
  2. Regulatory compliance requirements that have paved the way for cloud-based electronic health records (EHRs).
  3. Digitization that has placed the overall control in the hands of consumers, requiring IT modernization with cloud computing at the center. This information-centric approach has ensured seamless collaboration, cooperation and information sharing.
  4. Healthcare delivery transformation which has helped healthcare to transcend distance, time and local practices. This ensures seamless collaboration in real-time which only cloud-enabled IT can provide.

Benefits of Cloud Computing

  1. For healthcare providers, any innovation should result in business benefits, and the importance of cost savings can never be undervalued. Cloud computing provides cost flexibility with the potential to reduce costs on an ongoing basis. Capital expenditure allocated to infrastructure can be avoided, and this itself becomes a huge cost saving, along with the savings resulting from pay-as-you-use models.
  2. Benefits, when not related to cost, are often tied to operational efficiency. The cloud offers scalability and the ability to adjust to demand. It offers seamless communication and collaboration advantages which lead to optimized operational efficiency.
  3. Cloud-based healthcare services providers ensure that their customers benefit from the superior security and their data remains well protected. Cloud service providers offer sophisticated controls including data encryption and access control, avoiding the need to store information on local devices.
  4. Healthcare functionality is enhanced greatly by cloud services which offer the potential for broad interoperability and integration. Cloud-based services help healthcare systems to remain connected enabling remote access to applications and data.
  5. Along with functionality enhancement, cloud computing offers new capabilities to implement better ways of working with patients, improve patient care, and, on a macro level, improve healthcare management. Cloud services can augment the cognitive capabilities of healthcare providers’ staff to mitigate medical mistakes and minimize errors in judgement.

Innovative Healthcare solutions for ISVs & Providers

In the recent past, several healthcare organizations, ranging from small private clinics to large hospitals, solution providers to insurance companies have begun adopting cloud computing. These organizations are reaping the advantages of cloud computing by automating and orchestrating their virtual assets. They are better equipped to mitigate disasters and are reaping the benefits of real-time intelligence. Over and above all this, IoT augmented patient care, cognitive assistance to medical professionals and economies of scale are pushing more and more organizations to opt for the cloud. These benefits extend beyond cost and service level drivers to improved responsiveness with internal business partners and decreased administrative overhead.

Can DataOps Help Data Scientists to Deliver Increased Business Value?

In the digitally transformed world, as businesses continue to grow, the pressure on data scientists to deliver workable models in accelerated time is immense. In a typical scenario, once a valuable insight has been found in the data, data scientists have to make it production-ready, i.e., put this data to use throughout the organization and integrate it into business processes. However, it is difficult for data scientists to predict accurately how algorithms and models will perform in production, keeping in mind that the conditions surrounding legacy data may not be applicable or even pertinent in evolving times.

Most of us are already familiar with DevOps, a practice in which IT operations collaborate with software development, resulting in a faster pace of going to market. An offshoot of DevOps, DataOps is a buzzword making its rounds in the world of data science. DataOps is designed to do away with roadblocks when developing or deploying data-intensive applications like the predictive models built by data scientists. Gartner has defined DataOps as “The hub for collecting and distributing data, with a mandate to provide controlled access to systems of record for customer and marketing performance data, while protecting privacy, usage restrictions and data integrity.”

Related: Make data-driven decisions to improve business results and manage risk

DataOps leverages process change and organizational realignment to streamline data management for everyone who accesses or handles data. DataOps connects the dots between data collection and preparation. It calls for a democratized atmosphere where data infrastructure is centralized and available to all stakeholders. It requires crossing the organizational and cultural barriers that separate data and people, bringing the two data audiences together. On one side we have the data operators, the people responsible for infrastructure, security, etc.; on the other side we have the actual consumers of data, the people responsible for using data to drive change, such as data scientists. What DataOps does is bring these two audiences together, eliminating friction points. DataOps focuses on governance, operations, delivery, data transformation, and version control. In a complex technology landscape of legacy systems and cloud solutions, DataOps will help to leverage the right technology for the right solutions and reduce friction.

To make this model work, it is important not to treat data scientists as separate from the end product. They become part of the team that analyzes, questions, and identifies the datasets that need to be analyzed. This information can then be handed over to the data team. There are a lot of considerations to be made when implementing a DataOps model. Already, several DataOps models are making the rounds, and organizations are exploring the methodology.

To summarize, DataOps will help to ensure that models which perform well in a lab, perform the same way in production.

Artificial Intelligence (AI) and Its Impact on Software Testing

Enterprises impacted by ‘digital disruption’ are forced to innovate on the go, while delighting customers and increasing operational efficiency. As a result, software development teams that are used to time-consuming development cycles no longer have the luxury of time. Delivery times are decreasing, but technical complexity is increasing, with emphasis on user experience!

Continuous Testing has been able to somewhat cope with the rigorous software development cycles, but keeping in mind the rapid speed with which innovation is transforming the digital world, it might just fall short. What is therefore needed is the ability to deliver world-class user experiences, while maintaining delivery momentum and not compromising on technical complexity. To meet the challenges of accelerated delivery and technical complexity requires test engineers to test smarter instead of harder.

So what has all this got to do with Artificial Intelligence (AI)?

The fact is, AI and software testing were never discussed together. However, AI can play an important role in testing and it has already begun transforming testing as a function and helping development teams to identify bug-fixes early, assess, and correct code faster than ever before. Using Test Analytics, AI-powered systems could generate Predictive Analytics – to identify specific areas of the software most likely to break.

Before delving into AI-based software testing, it might be good to understand what AI actually means. Forrester defines AI as “A system, built through coding, business rules, and increasingly self-learning capabilities, that is able to supplement human cognition and activities and interacts with humans naturally, but also understands the environment, solves human problems, and performs human tasks.”

Related: Improved time to market and maximized business impact with minimal schedule variance and business risk

AI is providing the canvas for software testing, but its uses have to be defined by testers. Some engineers have already put their imagination to the test and use AI to simplify test management by creating test cases automatically. They know that AI can help to reduce the level of effort (LOE) while ensuring adherence to built-in standards.

AI could also help to generate code-less test automation, which would create and run tests automatically on a web or mobile application. AI-based testing could identify the ‘missing requirement’ from the Requirements document, based on bug-requirement maps.

Machine learning bots are capable of helping with testing, especially with end-user experience taking the front seat. When trying to understand the role of bots in software testing, we need to bear in mind that most applications have some similarities, i.e., screen sizes, shopping carts, search boxes, and so forth. Bots can be trained to be specialists in a particular area of an app. AI bots can manage tens of thousands of test cases, compared with the far smaller numbers that regression testing can handle. It is this ability that elevates the importance of AI testing in the DevOps age, where iteration happens on the go.

To summarize, while bots do some of the routine stuff, testers can focus on more complex tasks, taking the monotony out of testing and replacing it with the word ‘exciting’.

Learn more about Trigent’s automation testing services.

Read Other Blog on Artificial Intelligence: 

The Impact of Artificial Intelligence on the Healthcare Industry

To Opt or Not? Can Traditional Industries Use Machine Learning to Garner Business Insights?

Machine learning is a scientific discipline that uses algorithms to learn from data instead of relying on rules-based programming. It works in three stages, i.e. data processing, model building & deployment, and monitoring, with machine learning binding the three together. The power of machine or deep learning cannot be underestimated and as Alexander Linden, Research Vice President of Gartner says, ‘Deep learning can give promising results when interpreting medical images in order to diagnose cancer early. It can also help improve the sight of visually impaired people, control self-driving vehicles, or recognize and understand a specific person’s speech’.

To Opt or Not

Traditional industries have many processes which are governed by rules-based software. This approach is limited in its ability to tackle complex processes. If the rules-based learning can be substituted with self-learning algorithms, then valuable patterns and solutions would emerge.

As a result of digitization and the Internet of Things, there is a proliferation of data. If you believe this data will help you make intelligent decisions based on patterns, add machine learning. There is no need to add it otherwise, as it can complicate an existing business. Starting with the smaller pieces of the puzzle is better than jumping in head on. For example, one can collate information from regular reports and apply machine learning to make forward-looking predictions.

Machine learning can be useful to detect anomalies, enhance customer service, and recommend new products. Manufacturing companies, for example, can benefit from machine learning systems that examine video footage, spotting defects so that faulty items are automatically rerouted.
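As a toy illustration of the anomaly-detection idea (not a production ML model), a simple statistical check in PHP can flag unusual readings in a metric stream; the detectAnomalies() helper and its threshold are purely illustrative:

```php
<?php
// Toy sketch of anomaly detection: flag values more than $threshold standard
// deviations from the mean. Real systems would use trained ML models; this
// only illustrates learning "normal" from the data instead of fixed rules.

function detectAnomalies(array $values, float $threshold = 3.0): array
{
    $n = count($values);
    $mean = array_sum($values) / $n;

    // Population variance and standard deviation of the readings.
    $variance = 0.0;
    foreach ($values as $v) {
        $variance += ($v - $mean) ** 2;
    }
    $stdDev = sqrt($variance / $n);

    // Collect the indices of readings that deviate too far from the mean.
    $anomalies = [];
    foreach ($values as $i => $v) {
        if ($stdDev > 0 && abs($v - $mean) / $stdDev > $threshold) {
            $anomalies[] = $i;
        }
    }
    return $anomalies;
}
```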

Recent developments in machine learning suggest a future in which robots, machines, and devices will be able to operate more independently by running on self-learning algorithms. This would have far-reaching effects in terms of improved efficiency and cost savings.

Related: Reshaping your business with AI

Machine learning works best on specific tasks where input and output can be clearly stated. If an organization has a sufficient amount of data, with enough variation, machine learning can produce meaningful approximations.

Finally, it is the technical barriers that become the biggest hurdle in the transition process. To address the actual challenges and the perceived ones, companies need to identify expert data analysts who are capable of developing the intricate algorithms that machine learning requires. It will also require a team of engineers who can provide strategic direction, manage quality, and train internal resources on the tool.

Why QA Offshoring Pays Off with the Right Strategy

Here’s a heads up on outsourcing testing in a DevOps world.

Digitization has disrupted existing business models, processes, and strategies, but some factors continue to remain constant, i.e. cost, time-to-market, and experience. Of the three, customer experience has taken precedence, making quality assurance a non-negotiable constant.

Over the years, quality assurance has been associated with different implications, including certifications and functionalities. Today, quality assurance is all about people, or customers, and their personal experience with a product or solution. This makes sense when we view the world from the perspective of the Internet of Things and digital transformation, where personal experiences define a brand’s efficacy. QA organizations, therefore, are evolving to include social and psychological impacts in value delivery. Value delivery includes cost and time saved, and QA plays a pivotal role in a project from its requirements stage to ensure that non-productive time spent on testing in the last phase is eliminated.

In its new avatar, QA is instrumental in developing and launching a successful project. QA is built into all aspects of a project and instrumental in process improvement as well as defect management in agile testing, security testing, accessibility testing, performance testing, and user acceptance testing. In an agile environment, the mantra is “test early and test often”.

Related: Improved time to market and maximized business impact with minimal schedule variance and business risk.

QA is an integral part of a project, from reviewing user stories with business analysts as they are created, to ensuring that they meet the ‘testable’ criteria. QA engineers create test cases well before the start of a sprint to facilitate test-driven development (TDD). They work side-by-side with developers and are responsible for the entire deployment of the quality assurance environment.

With QA playing such a pivotal role in a project, the question which often arises is, ‘How would offshoring QA work? Will it be beneficial? And, what does it actually involve?’

Most companies that have in-house software development arms will have a testing team in place, but QA should not be confused with this team. In the new world, quality assurance requires skills and experience that can only be met by seasoned QA professionals whose job is to focus on delivering best-in-class solutions.

Secondly, the cost and quality of software testing can affect the overall cost of a project. In some cases, it is estimated that testing can cost up to 40% of a project’s overall cost. Intangible costs can include poor test execution, lowered customer satisfaction, higher operating costs, and increased business risk. CIOs, therefore, at least in the last decade or so, have opted to offshore their testing work to save on both tangible and intangible costs.

Offshoring QA especially makes a lot of sense for SMBs with tight budgets. These companies normally defer QA to the end which results in a less stable product. Having a team that works within the budget makes more sense. In this model, the core team will be retained on a long-term basis and a flexible team can be added/reduced based on the ebb and flow of the project. This option provides faster ramp-up and flexible ramp-down of testing resources.

Also, the follow-the-sun model makes great sense for offshore testing. Imagine a team working on functionality in the day and another team across the world testing it the same day. The hours saved add up to make complete business sense.

Viewed in terms of strategic business management, offshored quality assurance in software testing can result in better-quality applications, reduced business risks, and improved critical testing processes. Having said that, the key to success is finding the right offshoring partner. Companies that are considering offshoring their QA must look at the competencies and capabilities of the partner company, along with cost savings.

Some critical factors to be kept in mind when considering offshoring are:

  • Cost-Efficiency

More often than not, the word offshoring is considered synonymous with cost savings but to actually reap the benefits of saved budget and time with exemplary results requires rock-star testers and not a pool of untrained workers offering cheaper rates.

  • Industry Experience

As we already know, each industry is different and has its own unique business processes. Having a team of great testers with no clue about a business will only end up slowing down the testing efforts. One must choose a team of QA professionals with strong industry knowledge to ensure that the areas with the highest level of business impact get the highest testing priority.

  • Technology Frameworks and Best Practices

QA professionals should ideally have some unique intellectual property and best practices that they bring with them. A team that has successfully completed multiple projects will have a set of best practices, accelerators, methodologies, and tool kits to accelerate the testing efforts and reduce time to market.

  • Cultural Fit

When considering offshoring, especially of testing services, cultural fit becomes paramount to the project’s success. Cultural fit is acquired only by working with partners who have managed projects in the geographies under consideration. It is only with experience that an offshore team can communicate, work at the required pace, and deal with issues as and when they arise. In addition, if you need a large managed service, it is also important to have an on-site lead to ensure accountability.

  • Agile is important

The role of testing in agile practices is already recognized and yet several organizations struggle to integrate testing and quality into their agile delivery methods. A partner who understands how testing ‘fits’ into the development effort will work to resolve problems on the go to ensure that the product is not delayed.

To summarize, when identifying a partner, a company must trust the partner’s suggestion on the engagement model. The QA partner should be able to mobilize people, knowledge acquisition, infrastructure, and processes. The next step would be for the independent QA and testing team to integrate seamlessly with the project team. The QA team should have strong industry knowledge and translate it into the user experience. This knowledge will define the project’s overall flow.

Keeping these critical factors in mind, and with a well-defined strategy in hand, a successful QA offshore engagement is possible. Add one more ingredient, i.e. trust, and the project is set up for success.

Big Data Analytics Can Play an Important Role in Healthcare

Before diving into the importance of big data analytics in healthcare, it is worth understanding how small data and big data differ in healthcare.

Global healthcare is in a state of flux, with big data analytics emerging as a powerful tool to transform clinical, operational, and administrative functions, among others. The healthcare IT market has grown from basic EMR solutions to specialized hospital information management solutions and healthcare information exchange systems, and the Healthcare IT Solutions Market Report predicts growth at a CAGR of 13.57 percent through 2022. This growth is being fuelled by the increasing role big data plays in managing patient care, reducing costs, and improving quality, while keeping one eye steadily focused on operational efficiency.

Related: Innovative Healthcare solutions for ISV’s & Providers

Realizing the potential of big data

Probably realizing the value of big data, the Health Information Technology for Economic and Clinical Health (HITECH) Act created a $30 billion federal grant program as an incentive to adopt EHRs, which has helped generate enormous volumes of structured and unstructured data. This data is finding uses across functions and services.

Value-Added Services

Insurance companies, for example, are moving from fee-for-service models to value-based, data-driven payments by using electronic health records that enable high-quality patient care. In the value-based model, doctors, hospitals, and insurers work together to deliver care that is measured by patient satisfaction, and this model relies on data from EHRs.

Cost Savings

The same EHR data has also helped mitigate fraud, thereby increasing cost savings. For example, the Centers for Medicare and Medicaid Services prevented more than $210.7 million in healthcare fraud in one year alone. Insurance companies have also experienced a higher return on investment; United HealthCare generated a 2200% return on investment in a single year. Big data analytics has helped these companies take large volumes of unstructured historical claims data and use machine learning to detect patterns and anomalies. This has helped control overutilization of services, patients receiving the same services from multiple hospitals, and identical prescriptions being filled in multiple locations.
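The claims-analytics idea above can be illustrated with a toy sketch. This is not how any insurer’s production system works; it simply flags claims whose amounts deviate sharply from the historical average, using invented data and a simple z-score rule in place of a trained machine-learning model.

```python
# Hypothetical sketch: flagging anomalous claims with a statistical baseline.
# Real fraud detection uses far richer features and trained models; this only
# illustrates the pattern/anomaly idea. All claim data below is invented.
from statistics import mean, stdev

def flag_anomalous_claims(claims, threshold=2.0):
    """Return claims whose amount deviates from the historical mean
    by more than `threshold` sample standard deviations."""
    amounts = [c["amount"] for c in claims]
    mu, sigma = mean(amounts), stdev(amounts)
    return [c for c in claims if abs(c["amount"] - mu) / sigma > threshold]

# Invented historical claims: one is wildly out of line with the rest.
history = [
    {"id": "C1", "amount": 120.0}, {"id": "C2", "amount": 135.0},
    {"id": "C3", "amount": 110.0}, {"id": "C4", "amount": 125.0},
    {"id": "C5", "amount": 130.0}, {"id": "C6", "amount": 115.0},
    {"id": "C7", "amount": 128.0}, {"id": "C8", "amount": 122.0},
    {"id": "C9", "amount": 118.0},
    {"id": "C10", "amount": 4800.0},  # possible overutilization or fraud
]

suspicious = flag_anomalous_claims(history)
print([c["id"] for c in suspicious])  # → ['C10']
```

A production system would use robust statistics (median/MAD) or a trained model, since a single extreme outlier inflates the standard deviation and can mask itself in small samples.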

Predictive Patient Care

By analyzing structured and unstructured data, and using predictive modeling on EHR data, it is now possible to diagnose various illnesses earlier, which is helping to reduce mortality rates. Devices now monitor patients’ glucose levels, blood pressure, and other vitals; combined with machine learning and IoT, proactive care for patients is a reality. Advanced big data analytics can work with the unstructured data these sensors generate.
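As a minimal illustration of the sensor-monitoring scenario, the sketch below flags out-of-range glucose readings from a hypothetical wearable stream. The thresholds and readings are invented for illustration; real clinical systems would use validated ranges and trained predictive models rather than a fixed rule.

```python
# Hypothetical sketch: a rule-based alert over streaming glucose readings,
# illustrating how sensor data can drive proactive care. Thresholds and
# readings are invented; they are not clinical guidance.
def glucose_alerts(readings, low=70, high=180):
    """Return (timestamp, value, label) for readings outside the safe band."""
    alerts = []
    for ts, value in readings:
        if value < low:
            alerts.append((ts, value, "hypoglycemia"))
        elif value > high:
            alerts.append((ts, value, "hyperglycemia"))
    return alerts

# Invented readings (mg/dL) from a wearable sensor.
stream = [("08:00", 95), ("10:00", 190), ("12:00", 110), ("14:00", 62)]
print(glucose_alerts(stream))
# → [('10:00', 190, 'hyperglycemia'), ('14:00', 62, 'hypoglycemia')]
```

In practice the interesting step is the one the article describes: feeding these streams into predictive models so deterioration is anticipated rather than merely detected.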

To summarize, evidence-based medicine relies on patient data, which is increasingly available. Capturing data, however, is only the first step. The next is analytics, which will not only result in better patient care and engagement but also eliminate redundant testing, reduce expensive errors, and help save lives.

Single or Multi-cloud?

A recent study by 451 Research indicates that nearly a third of large organizations work with four or more cloud vendors, making one wonder whether multi-cloud is the future of cloud computing. The recent acquisition by Google of Orbitera, a platform that supports multi-cloud commerce, shows that Google recognizes that multi-cloud environments are the future. In a market estimated by Gartner to be worth $240 billion next year, multi-cloud creates a new front in the so-called “cloud computing wars.” This can only be good news for those businesses looking for flexibility, cost savings, and ultimately better solutions.

It appears that organizations that prefer multiple cloud providers have very logical reasons for this. They use multiple cloud providers to support specific applications or workloads. For example, a core application may need extra resilience to keep running through a power loss or to scale on demand, while another department in the same organization may need the cloud simply to enhance productivity. A single cloud solution could compromise one of these outcomes, which is probably why large companies with multiple functions end up with several clouds. Another reason, per a report by Ovum, is overall dissatisfaction with a single cloud service provider; key complaints include poor service performance and a lack of personalized support.

Related: We build impactful cloud solutions that solve challenging business problems.

One more reason could be that companies are wary of keeping all their applications and workflows in a single cloud, because it can leave them vulnerable and erode their long-term pricing leverage with the provider.

While the logic behind a multi-cloud environment may seem sensible, the fact remains that moving between clouds can be difficult. Cloud providers make it easy to move applications onto their platforms, but leaving is deliberately harder: providers do not want their business reduced to a price-sensitive commodity.

Many organizations are rightly concerned about the downtime involved in moving petabytes of data between cloud providers. Fortunately, the same data replication technologies that major cloud vendors offer to make it simple for customers to move to the cloud can also be used to migrate data between clouds.

The ramifications of this are huge. While Amazon Web Services (AWS) remains the dominant player in the space, businesses wanting the freedom to juggle multiple cloud services and avoid vendor lock-in may well help the other players to catch up.

Comments welcome!

Why Businesses Cannot Afford Software Glitches

In October last year, when a denial-of-service (DoS) cyber attack on a major DNS provider made many internet platforms and services unavailable, people realized how hopelessly reliant they have become on the internet. However, today businesses have even more to worry about as they watch and read about software system glitches. Very recently, some Starbucks stores were closed by a computer system outage caused by POS glitches, leaving customers in caffeine-withdrawal throes and probably making a strong dent in Starbucks’ revenue figures.

In another example, improved technologies in the banking sector have failed to stem the rising tide of fraud in the US, according to a study by analytic software firm FICO. Instead of hiding glitches, more businesses are ready to talk about their issues. For example, explaining a 12% drop in same-store quarterly sales, Rent-A-Center’s CEO Robert Davis said, “Following the implementation of our new point-of-sale system, we experienced system performance issues and outages that resulted in a larger than expected negative impact on core sales. While we expect it to take several quarters to fully recover from the impact to the core portfolio, system performance has improved dramatically and we have started to see early indicators of collections improvement.”

Related: All that you need to know about avoiding software glitches

Similar to the cases mentioned above, software failure can have very serious consequences for businesses that rely on their software systems to keep operations running. It can stop production, interrupt processes, and ultimately lead to financial loss. While end-to-end software systems are vital to organizations, their advantages come with risks, and risk management is therefore key to avoiding software glitches. Research indicates that the number one cause of software failure is human error in application development or programming. Given the prevalence of human error, it is inevitable that some software will deploy with bugs and errors that slip through the cracks during development. Business leaders may not have control over the source code or the development process, but they can take steps to prevent software malfunction and to identify potential problems before they interrupt day-to-day business. Talk to us to know more.

Cloud Transformation Requires More than a Mere Strategy

The Cloud Industry Forum (CIF) reports that more than half of the 250 large enterprises it surveyed are steering away from the cloud because they do not have the digital skills to make it a success. Summarizing the research, Alex Hilton, CEO of CIF, says that having the right skills in the broader workforce is critical for digital transformation. In this research, only 45 percent of respondents felt their companies had the skills required to adapt to digital transformation; another 15 percent are actively recruiting, while 30 percent of companies have no immediate plans to recruit.

The research underlines the fact that while organizations acknowledge the power of the cloud, they are not sure they have the right skills to bring about the transformation. This could be because cloud transformation has layers of complexity. Existing systems, information silos, security, compliance and regulatory requirements, internal buy-in, and service continuity are some of the key factors that make cloud transformation difficult to execute. Add to this the cost factor, business disruption, and unclear advantages, and the case for cloud transformation quickly weakens.

Digging a bit deeper, it is apparent that along with finding people with the required technical skills, digital transformation also demands an alignment of IT with business needs. While large cloud providers and training organizations are trying to address the skills gap with training courses, an underlying deficiency remains: training programs lean towards particular platforms or sets of technologies. Mostly technical in nature, they do not cover the business, security, or compliance aspects of digital transformation.

In fact, cloud transformation cannot be generalized. It is unique to every business and needs to relate to its specific requirements. It has to take into consideration various business specific requirements and map these to industry regulations and compliance requirements. It has to consider the existing systems and the viability of migrating these to the cloud. It has to look at the business holistically to formulate the strategy. And of course, there is then the need for technologists to execute the migration and a strong testing team to ensure 100% effectiveness.

Related: What you need to know about Cloud Transformation to exceed customer expectations

Those companies that have followed this process find that their cloud transformation has been a success. Their reliance on outside help to bring about the transformation is one of the key reasons for the success. So even as companies invest time and effort in training their existing IT staff in cloud technologies, they need to look at partners who have specific experience in their industry domains.

Trigent understands unique business requirements, assesses current investments in technology and applications, and helps create roadmaps. Trigent’s engineers are experienced in SOA/web services that allow mash-ups and integrations, permitting access from new mobile and/or cloud applications. For applications that can be modernized, Trigent creates a modernization strategy, such as “lift-and-shift”, complete re-architecture, or anything in between. By establishing a phase-wise project plan, reinforced with industry best practices and structured processes, the migration is sure to be well planned, executed, and supported.

To know more about how to transform your applications to the cloud, visit: Cloud Services

3 Compelling Reasons Why CIOs Resist Digital Transformation

A decade ago, businesses focused on data mining, search technologies, and virtual collaboration.  Many of them did not have a mobile strategy and social media was hardly leveraged to advance business goals.

Fast forward to 2017, and most technologists talk about digital transformation, artificial intelligence, the Internet of Things, and machine learning. However, if digital transformation is gaining mass popularity, why is there resistance to its acceptance as the new normal? A March 2017 study by Point Source titled ‘Executing Digital Transformation’ found that although companies planned to spend $1.2 trillion on digital transformation in 2017, less than half (44 percent) of IT decision makers are extremely confident in their organization’s ability to achieve the vision. According to the report, many of the roadblocks relate to organizational structure and culture. The fear of the unknown is also rooted in the ambiguity surrounding digital transformation.

Here are the top 3 reasons why CIOs resist digital transformation:

The ‘mostly unknown’ factor

Technology is used to create and install solutions for businesses. In the case of digital transformation, businesses have to start by examining their existing processes, looking for improvements, and identifying weak links and dependencies in the system. However, if everything in the existing processes is in place and working, why invest time, energy, and money in cross-examination? What is the value-add of this transformation? Without a clear idea of ROI, the need to change is met with resistance. IDC’s 2017 predictions for digital transformation and for CIOs confirm that only 40% of CIOs will lead the digital transformation of the enterprise by 2018.

“Working fine” processes and systems

Companies that have succeeded in creating a digital value proposition have a clear view of how they will exceed customers’ digital requirements. They also have realistic time-frames and budgets, which helps them visualize the outcome. Digital transformation helps close the gaps: unmet customer expectations related to technology, and processes that delay the achievement of goals. All this requires a futuristic vision rather than a preference for what is already working. Resistance to change is the biggest hurdle in the way of accepting innovation, and this resistance is stronger if existing processes and systems are, at an overall level, achieving business goals.

‘Where to begin’ with data

All organizations have tons of data. The challenge for many is its unstructured nature. Some have siloed systems containing bits and pieces of data. In the absence of a centralized information warehouse, there is no clear idea of what they want to accomplish with this data.


The challenges are more in the nature of spring cleaning: modernizing and following a structured, process-oriented way to run the business. Whether we call this digital transformation or something else, we need to look beyond the challenges to the advantages, knowing that our competitors have probably already begun their digital transformation journeys. As February 2017 research from McKinsey succinctly shows, companies that get digital transformation right win market share, and those that don’t actually see negative ROI on their investments.

Why Blockchain Technology is Disrupting the Healthcare Industry

Healthcare records, as of now, remain disjointed because of the lack of common architectures and regulations that permit the safe transfer of data between stakeholders. For example, a patient’s updated clinical data is stored in a database within the hospital or within a defined network of stakeholders.

As a paradigm shift in information distribution, Blockchain technology’s potential in healthcare can be groundbreaking. Imagine a limitless database with no centralized ownership, where all members cryptographically add, manage, and access data. When new data is added, all members on the network are notified. For patients, this could be their latest medical diagnosis.

To drill down to the specific case of a hospital, Blockchain technology offers an easy way to manage patients’ records where multiple entities are involved. If a patient decides to continue treatment elsewhere, the original hospital records must be transferred electronically or carried by the patient as physical documents, which is cumbersome, and the data could be outdated. Blockchain, behaving as a single source of truth, would give a healthcare provider anywhere an immediate and accurate view of the patient’s current medications.

The above is only a small example of where Blockchain can benefit the healthcare industry. It can make a positive impact in several areas, such as clinical data sharing, public health information, clinical trials, administration and finance, and patients’ personal data. Blockchain-enabled IT systems can make a huge difference to clinical data exchange, and on a more transactional level, they can help with claims and billing management. Research indicates that nearly 5 to 10 percent of healthcare costs are fraudulent; in 2016 this resulted in a loss of $30 million in the United States. By automating claims and payment processing, healthcare companies can reduce administrative expenses.

While the healthcare industry is beginning to acknowledge Blockchain’s power, many fear that the technology’s strength could itself be its Achilles’ heel: its decentralized approach may place it in a position of security vulnerability. Secondly, Blockchain works on the principle of unique identifier links; if the same person has multiple IDs, this duplication needs to be resolved before the records can be added to the blockchain.

Related: What you need to know about leveraging Blockchain technology for the healthcare industry

But these transitional challenges can be tackled, and technology companies are already working on solutions. Believers, for example, say that Blockchain actually enhances security and reliability. Per the Protenus Breach Barometer report, there were nearly 450 health data breaches in 2016 affecting over 27 million patients, and an estimated 27% of the breaches were caused by hacking and ransomware. Believers argue that with the growth of IoMT (Internet of Medical Things) ecosystems, Blockchain-enabled solutions will not only bridge the gap of device data interoperability but will also be more secure: altering one block in the chain is infeasible without altering every subsequent block in the chain’s chronology.
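The tamper-evidence property described above can be sketched with a toy hash chain: each block stores the hash of its predecessor, so altering one block breaks every later link. This is an illustrative sketch only, not a real blockchain (no consensus, signatures, or networking, and the record format is invented).

```python
# Toy hash chain demonstrating tamper evidence. Each block records the hash
# of the previous block; verification recomputes the links in order.
import hashlib
import json

def block_hash(block):
    """Hash a block deterministically (sorted keys for stable output)."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64  # genesis predecessor hash
    for record in records:
        block = {"data": record, "prev_hash": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def verify_chain(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False  # a link no longer matches: tampering detected
        prev = block_hash(block)
    return True

# Invented patient records, purely for illustration.
chain = build_chain(["Rx: drug A", "Rx: drug B", "Rx: drug C"])
print(verify_chain(chain))        # True: chain is intact
chain[1]["data"] = "Rx: drug X"   # tamper with the middle block
print(verify_chain(chain))        # False: the next block's link breaks
```

Note that this toy version only detects tampering of non-final blocks; real blockchains additionally anchor the latest block hash via consensus among many parties, which is what makes rewriting history impractical.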


The blockchain is based on open-source software, commodity hardware, and open APIs. These components ensure faster and smarter interoperability and help reduce the burden of handling large volumes of data. By using industry-standard data encryption and cryptography, healthcare companies can ensure compliance and security. Blockchain data structures can support a wide variety of data sources, including patients’ mobile devices, wearable sensors, EMRs, and so forth. With built-in fault tolerance and disaster recovery, data remains protected.

To summarize, Blockchain technology has carved a niche for itself in the healthcare IT ecosystem, and in the future, Blockchain could remove all blocks in the way of advanced precision healthcare.

The Impact of Artificial Intelligence on the Healthcare Industry

Artificial Intelligence (AI) is predicted to play a game-changing role in patient care. Take a small example of its help in medical diagnosis. Imagine a patient walking into a doctor’s office with symptoms indicative of several possible illnesses. To be sure, the doctor consults a digital assistant, which scans a global database and suggests a diagnosis based on deep data analysis. The doctor prescribes further tests to confirm the prediction, and here too, machine learning compares the images against the database and confirms the most likely cause of illness. With the help of accumulated intelligence, the doctor has hastened patient care and diagnosed the case. Not stopping there, the doctor introduces the patient to a chatbot that explains the disease and its treatment, and schedules follow-up visits as well as any further investigations, if required. Where time is of the essence, AI has just proved how invaluable it can be in patient care by shortening the diagnosis-to-treatment curve.

Machine learning has brought AI to the forefront of healthcare, and its impact on diagnosing and treating diseases is likely to be unsurpassed. Recognizing this trend, a 2016 study by Frost & Sullivan projects AI in healthcare to reach $6.6 billion by 2021, a 40 percent growth rate. The study further confirms that AI will enhance patient care delivery by strengthening the medical imaging diagnosis process. As an industry disrupter, AI will create real value for patients by supporting prevention, diagnosis, treatment, management, and drug creation.

Technology experts predict that in the next couple of decades AI will be a standard component of healthcare, augmenting and amplifying human effort. Its role will be as impactful, and as quiet, as the common X-ray machine. It will also automate time-consuming healthcare tasks that require tons of unstructured data to be converted into intelligence.

While some of the innovations we are talking about are futuristic, AI has already quietly infiltrated the industry. Healthcare players already use it to manage billing, appointment scheduling, and logistics planning. Moving into core clinical areas requires amassing data, and that too has already begun. With quantifiable data, diagnostics will become accurate and, as a result, indispensable in medical treatment. Does this mean we will see robot doctors in the place of human medical professionals? Let’s leave that to science fiction movies for now. What is more likely is AI-enabled medical professionals.

To summarize, we can only imagine AI’s impact on saving human lives going forward. Just imagine people in remote areas with limited access to diagnostics: AI helps the local medical professional remotely prescribe treatment, deliver medicines through an automated delivery system, and offer telemedicine. In a way, it has just helped to shrink the world.

Technology companies focusing on healthcare are investing in Centers of Excellence where AI-empowered healthcare IoT will bring dynamic changes: better control over existing processes such as supply chain, inventory management, equipment management, invoicing, and drug development, along with reduced latency, lower cost, and greater operational efficiency. At Trigent, while we solve the problem of productivity, we remain focused on helping healthcare organizations take care of more people with fewer resources. We do this by tapping our knowledge, experience, and expertise in data and machine learning.
