Challenge
This global, nonprofit healthcare organization provides public healthcare programs, including Medicaid, Medicare, and Marketplace plans, to underserved and vulnerable populations. Due to the nature of their business, they have a unique and significant demand for information. Their information management group recently began a vital initiative to implement a data supply chain strategy and develop more mature business practices that leverage their data and insights. The strategy gathered data from hundreds of source systems into a Hadoop data lake that served their data warehouse and analytical requirements. However, the execution was not achieving their goals: both the strategy and the delivery of data to the organization were falling short of timeliness and quality expectations.
Recognizing the need to change their strategy and implementation, they reached out to Fusion to partner in creating a data solution that would ensure the quality and timeliness of their data and the analytics built on it. Their goal was to implement a data supply chain delivery model that would accelerate the availability of high-quality data for their business users. Based on business needs and regulatory requirements in the healthcare space, they needed to ensure good data quality throughout the entire data flow and create a data governance model that aligned the different departments and supported federal and state compliance.
Solution
Our first step was to complete a current state assessment that collectively addressed all aspects of their data supply chain strategy. Through this assessment, we identified opportunities to address each challenge and provided prioritized recommendations for execution. These recommendations included:
- Establishing a scaled agile program management office
- Optimizing the delivery, data quality, and metadata processes under an Agile framework
- Leveraging data integration technologies to maximize delivery
- Realigning the data architecture
We outlined these recommendations in a roadmap that established the program operating model and realigned their architecture to create the foundation for reliable delivery. After developing a data lake in Hadoop that collects all data from the source systems to support diverse use cases, we developed an enterprise metadata strategy in which all data is accompanied by the many forms of metadata that better serve business stakeholders and technical users. Looking beyond their initial needs for BI and analytics, the solution also supports more diverse use cases, including PHI handling, security, and compliance, with proper data quality throughout the data flow.
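As an illustration of the metadata-first approach described above, here is a minimal sketch of the kind of record that might accompany each dataset as it moves through the supply chain. The field names and structure are our own assumptions for illustration, not the client's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DatasetMetadata:
    """Hypothetical metadata envelope attached to each ingested dataset."""
    dataset_name: str
    source_system: str            # one of the hundreds of upstream systems
    ingested_at: datetime
    record_count: int
    quality_score: float          # e.g., share of rows passing validation rules
    contains_phi: bool            # drives security and compliance handling
    lineage: list[str] = field(default_factory=list)  # upstream processing steps

meta = DatasetMetadata(
    dataset_name="claims_daily",
    source_system="claims_adjudication",
    ingested_at=datetime.utcnow(),
    record_count=1_250_000,
    quality_score=0.997,
    contains_phi=True,
    lineage=["landed", "validated", "conformed"],
)
```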
Challenge
A state agency governing child services, programs, and policies was undergoing a significant transition and wanted to modernize its technology and data platforms to better fit its new goals. As the agency worked to transform some of its core functions, questions arose about its data landscape, specifically the architecture for delivering traditional reporting, BI, and analytics in a modern landscape.
The agency was focused on one primary business case: enabling more proactive case management. Case management impacts stakeholders statewide and is highly regulated, with federally mandated timeframes for compliance. Their goal was to take the regularly updated raw data from their operational environment and bring the desired analysis to life, resulting in insights they could act on.
The agency had decided to use Amazon Web Services (AWS) as a platform, but they had limited developer resources skilled on AWS. They were also dealing with significant bottlenecks created by process issues and backlogs at different levels of the agency and the other departments involved. Fusion collaborated with Amazon to help the agency envision and showcase the capabilities of AWS to deliver a modern data platform that would integrate seamlessly into their ecosystem and build a foundation for the future.
Solution
First, we completed an assessment of the current data landscape and the processes used to enter and analyze data. This included:
- Identifying child support cases within the state that had exceeded federal timeframe compliance rules and cases that were approaching the federal deadlines
- Pinpointing bottlenecks in processing at the agency and caseworker levels that were leading to compliance issues
- Understanding where and how data was currently being entered and utilized throughout the agency
We created a data model design that supported BI analysis of current cases and analysis of trends over time. Given the agency's limited AWS developer resources, they needed a solution that would let them transition their current skills and build new capabilities effectively. Based on the model design for their immediate use case (case management), we architected a solution to integrate different subject areas from their core systems and provided a modern data reference architecture to guide platform migration, all implemented in the AWS cloud environment. The solution included standing up native AWS services, including IAM, VPC, S3, Lambda, Glue, Glue Catalog, Aurora on Relational Database Service (RDS), Redshift, and QuickSight, to:
- Provide an S3 landing zone for the semi-structured extracts of full and delta data from their operational systems
- Establish raw (S3), curated (Aurora PostgreSQL on RDS), and enriched (Redshift) environments for the data as it is processed from a transactional arrangement of string inputs into a strongly typed and structured format for reporting (see the sketch after this list)
- Design data models and build databases in each of the environments
- Demonstrate additional AWS-specific capabilities for future use cases involving volume growth
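To make the raw-to-curated flow concrete, here is a minimal sketch, assuming an S3-triggered Lambda that kicks off a Glue job to promote newly landed extracts into the curated zone. The bucket names, job name, and event wiring are hypothetical; the boto3 calls themselves are standard.

```python
import boto3

glue = boto3.client("glue")

def handler(event, context):
    """Hypothetical Lambda: runs when a new extract lands in the raw S3 zone."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]   # e.g., agency-raw-zone
        key = record["s3"]["object"]["key"]       # e.g., cases/2021-06-01/delta.json
        # Start the Glue job that parses the string inputs, applies types,
        # and writes the curated output toward Aurora PostgreSQL.
        glue.start_job_run(
            JobName="promote-raw-to-curated",      # hypothetical job name
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
```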
Challenge
State agencies generate a massive amount of data. This Midwest state generates about 4 petabytes of data, spread across 1,600 databases and 120 agencies, boards, and commissions, and its governor recognized a need to leverage all of it. He tasked an agency with unlocking the power of the state's data resources by removing the silos and applying advanced data analytics.
Although a massive amount of data was available, it was extremely underutilized, used almost exclusively for traditional reporting within each agency, with no analytics. Each agency had access solely to its own data, and there was no precedent or process for sharing data across agencies. Without insights from data analytics, or data from other agencies to create a more holistic picture, each agency was stifled, unable to use even its own data in a meaningful way.
And although the solution might sound simple ("just share the data!"), it is far more complicated than that, especially for the highly regulated industries within the state government. Privacy concerns and the implications of data sharing had to be addressed, and they had kept the state from pursuing similar initiatives in the past. The state was also burdened by its current technology landscape: these agencies are not known for cutting-edge technology platforms or highly customized solutions. Instead, they were saddled with legacy platforms, long-term contracts, and uphill battles to institute change.
Although there were significant roadblocks to consider, the governor wanted to push forward. He maintained that unlocking data was imperative to allow the state to identify and drive meaningful social change and tackle complex problems facing residents' health, security, and well-being. This agency engaged Fusion to help establish a data sharing and analytics platform that would allow the state to unlock its data's full potential through analytics.
Solution
We worked with the state agency to develop a robust data-sharing platform that would allow the agencies to share data securely. We created a big data platform that allowed the different agencies to share resources, tools, and common services to execute their analytical use cases. Developing the platform included:
- Enabling platform administration services for the care and feeding of the platform
- Establishing data governance processes and policies, including data lineage, metadata management, data quality assessments, and usage monitoring
- Defining a data governance program
- Configuring security services to address concerns around security and privacy, including authorization, authentication, auditing/monitoring, and encryption
- Developing deidentification capabilities to protect critical data (see the sketch after this list)
- Defining the framework that allowed for shared capabilities for ingestion and analytics, along with the flexibility to let agencies integrate uniquely with their internal systems
- Establishing policies, procedures, and data sharing agreements to allow agencies to best manage their data
- Defining policies to enable agencies to launch and solicit new projects
- Defining the lifecycle for onboarding data, including planning, ingestion, governance, processing, enrichment, and consumption
- Socializing, evangelizing, and consulting with different agencies to help them leverage the analytics platform for their use cases
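As an illustration of the deidentification capability mentioned above, here is a minimal sketch of keyed pseudonymization: direct identifiers are replaced with HMAC tokens so records can still be joined across agencies without exposing the underlying values. The field list and key handling are illustrative assumptions, not the state's actual implementation.

```python
import hashlib
import hmac

# In practice the key would live in a secrets manager, never in code.
PSEUDONYM_KEY = b"replace-with-managed-secret"

DIRECT_IDENTIFIERS = {"ssn", "name", "street_address"}  # hypothetical field list

def pseudonymize(record: dict) -> dict:
    """Replace direct identifiers with stable HMAC tokens.

    The same input always yields the same token, so deidentified records
    from different agencies can still be linked for analytics.
    """
    out = {}
    for field, value in record.items():
        if field in DIRECT_IDENTIFIERS:
            digest = hmac.new(PSEUDONYM_KEY, str(value).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]
        else:
            out[field] = value
    return out

print(pseudonymize({"ssn": "123-45-6789", "county": "Franklin"}))
```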
Multi-billion-dollar annuity org implements new modern data platform to keep up with looming Baby Boomer retirement trends
Challenge
Statistics show that in 2031 the U.S. population over the age of 65 will be around 75 million, almost double what it was in 2008. This national retirement services organization recognized that a generation of this size transitioning out of the workforce would significantly impact the economy and asset management. Our client understood that the migration of the Baby Boomer generation into their retirement years would instigate a significant transfer of wealth, shifting the balance from asset accumulation to asset withdrawal. Additionally, as wills are executed for this generation, asset wealth will begin to shift to the next generation, one that is more digitally focused, with different needs and increasing expectations for customer experience and engagement.
They realized that to better serve transitioning and emerging clients, they needed a more integrated approach to comprehensive data about their clients, products, and beneficiaries. But it wasn't that simple. Their data assets were scattered across the organization's siloed business systems and disconnected data warehouse and reporting assets. They needed a partner to look at their entire data landscape, assess its current state, and provide strategic, actionable next steps to meet their new business needs. As a well-established, multi-billion-dollar organization, they had numerous legacy systems, a highly fragmented data asset landscape, and long-term managers not ready for change. Their main challenges were:
- A lack of integrated architecture for data across the diverse systems landscape
- Redundant sources of reporting and analytics data
- Mistrust of their data resulting from a lack of data integrity and inaccuracies across different platforms
- Challenges with data analytics due to difficulty accessing applicable integrated data
After over a decade of working together on different initiatives, they turned to Fusion to provide the assessment, framework, and roadmap they were looking for.
Solution
With an understanding of their goal, to better serve different client populations and prepare for changes in asset management, and of the current roadblocks, our team got to work with our customized three-phased approach:
1. Analysis
2. Program strategy alignment
3. Maturity assessment and program roadmap
Our team began by conducting a data analysis to assess their data and analytics environment and to identify and prioritize business needs and drivers. We provided a data charter that included a full inventory of their strategic business initiatives and use cases, an outline of their objectives and program goals, and a list of the identified challenges and barriers. With the assessment and analysis complete, we created a cohesive strategy to realign their data management and analytics capabilities with their business objectives, which included:
- Prioritization of strategic business initiatives
- Key milestone targets for program increments
- Capability gaps to address
We also completed a current state data and analytics maturity assessment and worked with the client to understand where they wanted to be on the data maturity scale. From there, we built an integrated, multi-year roadmap to address master data challenges. We then designed and built a modern data platform to support data science, management reporting, and executive dashboards, all with integrated data that was accurate and reliable.
After an intensive POC process, this oil co-op landed on Azure and Snowflake for a modern data platform
Challenge
This leading Midwest oil refinery cooperative must manage many business processes to successfully deliver fuel and oil, oversee pipelines, and manage retailers. To do so, they were relying on point solutions from multiple vendors for many of their operational needs. They had data coming in from these disparate solutions but no enterprise data management platform to integrate the information and support accurate business intelligence reporting. And they were not just looking to integrate their data; they wanted to elevate their data landscape with advanced analytics and machine learning to improve business outcomes such as price optimization and identifying targets for profitable oil exploration.
Their primary goal was to evaluate which platforms would best support their business needs and provide scalability and efficiency for their anticipated use of the platform. We had done previous work for this client, so when the time came to integrate and modernize their data landscape, they reached out for help.
Solution
Modern data platform evaluation
The first step was to assess the client's current data landscape, the state of their applications, and their demands for data integration based on business needs. We looked at their business requirements for data and functionality from the key sources, including WolfPack, Guardian, and select unstructured manual data sources, and organized the data into subject areas to serve as a semantic layer for all available data. With an understanding of their current state, we then began evaluating viable solution options for a target modern data platform. Based on feedback from the client, we presented two options focused on cloud implementations that support BI and analytics: an Azure-based data architecture and a Snowflake Data Cloud solution. We then provided the client with a complete assessment deliverable that included our findings on their current state and viable options for the modern data platform, including:
- An outline of solution options, components, benefits, and impacts
- An outline of the data architecture
- Estimated costs for development and deployment
- Anticipated operational costs
- Side-by-side comparisons of key features and considerations of the two options
We also provided the client with a high-level roadmap of prioritized next steps, which included the creation of POCs to test the business use cases against.
Modern data platform proofs of concept (POCs)
Armed with the information from our initial assessment, the client asked us to execute a combination of POCs to demonstrate how these modern Azure-based cloud platforms and technologies would support the company's BI and analytics needs. To create the POCs, we defined the modern data platform reference architecture within Azure with options for specific technologies, including native Azure components (e.g., Azure Data Factory, Azure Data Lake, Azure SQL Data Warehouse, Azure Analysis Services, Power BI, and Azure Databricks) and Snowflake.
Building the POCs included:
- Configuration of the Azure subscription and services
- Design of structures in Azure SQL Data Warehouse
- Creation of a tabular model in Azure Analysis Services
- Execution of data pipelines into Azure and Snowflake
- Creation of a Snowflake database and designated views
- Build-out of data pipelines between Azure services and Snowflake (see the sketch after this list)
- Creation of an analytics model using Spark with Azure Databricks
Once the platform was created, we demonstrated BI reporting using Power BI against the POC platforms and the execution of machine learning use cases using Azure ML Studio and custom model development. We then tested multiple use cases for data ingestion, BI, and analytics. We validated various use cases for viability, including:
- Financial reporting using Dynamics AX data integrated with budgeting and pricing data
- Supply chain optimization
- Predictive maintenance
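For a sense of what the Azure-to-Snowflake pipeline step might look like in Databricks, here is a minimal PySpark sketch using the Snowflake Spark connector. The storage path, credentials, and table names are placeholders, not the client's actual configuration.

```python
# Runs inside an Azure Databricks notebook or job, with the Snowflake
# Spark connector available on the cluster.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read a curated extract from Azure Data Lake (path is hypothetical).
df = spark.read.parquet("abfss://curated@oilcoopdatalake.dfs.core.windows.net/pricing/")

# Connection options for the Snowflake connector (all placeholders).
sf_options = {
    "sfURL": "account.snowflakecomputing.com",
    "sfUser": "etl_user",
    "sfPassword": "***",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PRICING",
    "sfWarehouse": "LOAD_WH",
}

# Land the data in Snowflake for the BI and analytics use cases.
(df.write
   .format("snowflake")
   .options(**sf_options)
   .option("dbtable", "PRICING_DAILY")
   .mode("overwrite")
   .save())
```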
Can a bracelet really save your life? See how wearable devices resulted in improved care quality and patient outcomes for some of our most vulnerable populations.
Challenge
Nursing homes have consistently struggled with staffing ratios, and the problem has only worsened with the pandemic. With a caregiver shortage, nursing home administrators are having to find new ways to keep their patients, a vulnerable, elderly population, safe. That's where BioLink Systems came in.
Nursing homes need to constantly monitor patient vitals to ensure they are providing appropriate care and can intervene as quickly as possible if there is an issue. That can be difficult with large patient loads and patients who sometimes cannot communicate their needs to staff. And with an elderly population, things can spiral out of control quickly if they aren't caught in time. By constantly monitoring a patient's vitals, nursing staff can address concerns quickly and mitigate potentially catastrophic events.
BioLink Systems had initially created a device that could attach to an adult brief to monitor urination levels and the patient's body position. However, they quickly began experiencing issues with their prototype and realized that although it demonstrated their capabilities, it was not ready to take to production. They needed to fix these issues quickly, so they reached out to Fusion Alliance for assistance.
Solution
The first step was to fix the proof of concept (POC) to create a new demo to secure the necessary funding for the project. However, even after the initial fixes, BioLink realized they needed to start over. The POC had to be rebuilt, including not only the software, such as web pages and portals, but also the hardware and firmware. Our solution was to architect a full-blown IoT solution from scratch. With the initial POC, all of the data was on-premises; with the new architecture, we moved everything to the cloud. We ensured each patient had a unique identifier and that all data was encrypted, safe, and HIPAA compliant. In addition to re-creating their initial adult brief wearable, we created a wearable device to monitor patient vitals, including body temperature, oxygen levels, and heart rate.
Accurate data in real time
Once we had a solution for collecting patient data via wearable devices, we needed to ensure that it was transmitted to the right people at the right time. To do so, we created smart hubs spread around the facility to upload data to the cloud, along with web portals and a mobile application for caregiving staff. The web portals allow caregiving staff to check in at nursing stations and in offices, and the mobile application is something they can carry to check on a patient immediately and provide real-time feedback when they are alerted to a patient concern. These executions allow data to be transferred from the cloud to the portals and mobile apps, where caregiving staff are notified with an alert of any patient who requires follow-up care.
Improving patient care & outcomes
Each patient device is outfitted with an NFC chip, so once the nursing staff is alerted, they cannot dismiss the alert until they scan the chip on the wearable device and complete the appropriate assessment. If the alert is not dismissed in a timely manner, as defined by the nursing home administration, the alert is escalated to a new caregiver.
This ensures that no alert is missed because a caregiver is attending to another patient or otherwise busy, and that interventions are completed in a timely manner.
Personalized monitoring & care powered by machine learning
Now that we have this data from patients, we can use machine learning to gain insights about a patient's care and needs. Initially, the patient care staff set thresholds for appropriate vital levels, but as more data is collected, the system learns what normal levels are for each specific patient and alerts accordingly.
Data security
The POC device acted as a Bluetooth beacon, so as we evolved and proposed a new solution, security was at the center of the entire project. We needed to ensure that data transmission was completely secure and HIPAA compliant, and also that the wearable device had significant battery life so that data was accurate and consistent. To ensure long battery life, we created a wearable and a smart hub with no screens that communicate only via Bluetooth. You simply put the wearable device into pairing mode, and it exchanges the encryption key through the cloud. The encryption key is sent to the smart hub, and the data is encrypted on the device and decrypted once it is in the cloud. In addition, each wearable device has its own dedicated encryption key, so if one wearable is compromised, BioLink can address the issue with that individual device without other wearables and data being compromised.
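To illustrate the per-device key model described above, here is a minimal sketch using symmetric encryption, with Fernet standing in for whatever cipher the actual firmware uses. The key registry and payload format are illustrative assumptions.

```python
from cryptography.fernet import Fernet

# Hypothetical cloud-side registry: one dedicated key per wearable,
# so compromising one device never exposes the others.
device_keys = {
    "wearable-0042": Fernet.generate_key(),
}

def encrypt_on_device(device_id: str, payload: bytes) -> bytes:
    """Stand-in for firmware-side encryption before Bluetooth transmission."""
    return Fernet(device_keys[device_id]).encrypt(payload)

def decrypt_in_cloud(device_id: str, token: bytes) -> bytes:
    """Cloud-side decryption once the smart hub has uploaded the reading."""
    return Fernet(device_keys[device_id]).decrypt(token)

reading = b'{"temp_f": 98.7, "spo2": 96, "hr": 72}'
token = encrypt_on_device("wearable-0042", reading)
assert decrypt_in_cloud("wearable-0042", token) == reading
```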
This manufacturer needed an analytics solution to develop predictive models and provide meaningful proactive recommendations to their fleets.
Challenge
Our client, a large-scale automotive components manufacturer, had recently implemented a real-time health feedback solution for fleets of vehicles using their components. The system used IoT engine and sub-system data points to create real-time notifications of issues with the vehicles, along with recommendations for correcting those issues. This system was implemented in Azure using Event Hub functionality and custom development. While the solution was successful, it was limited to reactively responding to warnings and errors as they occurred. The company needed an analytics solution that could support predictive models and provide meaningful proactive recommendations to the fleets.
Solution
To support the high-level analysis and machine learning required to build predictive models, a large amount of data over an extended time frame was needed. Since the current system only retained data long enough to provide real-time responses, a big data solution was required to capture and retain the IoT data. Additionally, there was a need to integrate manufacturing data with the IoT data for use in the analytics processes. To do so, we designed and implemented a data warehouse to collect and organize the data and provide a data platform for high-end analytics and machine learning. Working alongside the client, we also:
- Developed an Azure and cloud solution architecture
- Identified recommended technologies and planned for implementation
- Designed and developed an integration model to ingest enterprise source data
- Built the Azure data pipelines to process the near real-time data from devices
- Integrated data into a curated data model for BI & analytics
- Demonstrated the BI capabilities to prove the business value of the data
Ultimately, the solution uses the existing Event Hub functionality to output the data, Azure Data Factory to manage the data flow, and the Snowflake Data Warehouse to store, organize, and integrate the data.
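As a rough sketch of the capture-and-retain step, the following shows how a consumer might read device telemetry from the existing Event Hub using the azure-eventhub Python SDK and buffer it for warehouse loading. The connection string, hub name, and landing path are placeholder assumptions.

```python
from azure.eventhub import EventHubConsumerClient

# Placeholder connection details for the existing Event Hub.
CONN_STR = "Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=..."
EVENTHUB_NAME = "vehicle-telemetry"

def on_event(partition_context, event):
    """Persist each IoT reading instead of discarding it after the
    real-time alerting pass, so history accumulates for model training."""
    record = event.body_as_str()
    with open("landing/telemetry.jsonl", "a") as f:  # stand-in for the data lake
        f.write(record + "\n")
    partition_context.update_checkpoint(event)

client = EventHubConsumerClient.from_connection_string(
    CONN_STR, consumer_group="$Default", eventhub_name=EVENTHUB_NAME
)
with client:
    client.receive(on_event=on_event, starting_position="-1")  # from the beginning
```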
How one real estate company enabled corporate agility to gain valuable and actionable insights through data and analytics.
Challenge
Most organizations are brimming with data: information about customers, products, services, and more. All this data can be a treasure trove for companies, allowing them to customize customer experiences and improve internal efficiencies. But without the right strategy and technology, data can remain nothing more than a digital paperweight, providing little valuable insight to act on.
Our client, a large real estate investment firm, wanted more from their data. They were burdened by extensive manual data entry, multiple systems that held different information, and delays with reporting and analytics. They were also dealing with data storage problems: their on-premises infrastructure was about to reach capacity, so they knew it was time to act. Like many companies, they found themselves asking if now was the time to move to the cloud. Ultimately, their goal was simple: enable corporate agility by implementing a platform that would allow them to gain valuable, actionable insights through data and analytics. Now they just needed the right tools and expertise to get there.
Solution
Cloud solution
We started with an assessment of their existing system and data sources to determine the requirements for a data platform that would better provide stakeholders with the insights they need. Our client was looking at different cloud options and wanted to ensure their platform decision would work with their existing systems and allow them to improve current processes. More than just a new implementation, our client needed the right data strategy in place to meet their new business goals. Based on interviews with stakeholders and the results of the platform evaluation, we recommended a hybrid cloud model that would work with their existing infrastructure, utilizing Azure as the cloud solution and the Snowflake data platform. We created a roadmap that prioritized options for building out their data and technology strategy and a plan for getting from the current state to the updated future state.
Implementation & proof of concept
Armed with the recommendation and roadmap, this investment firm was ready for implementation. We worked with them to:
- Build out the new data architecture, data model, and ingest processes
- Identify and integrate new data sources into their analytics environment
- Create and refine data governance processes and information
After the client was able to store and manage their data more effectively, we worked with them to develop dashboards and analytics reporting that would give them the insights they needed to make informed business decisions. Given its analytics and visualization capabilities, automation functions, and ease of customization, we knew that Power BI was the right tool for this client's analytics needs. Because this real estate investment firm focuses on shopping malls, data from their shoppers was critical. They wanted to track how well their properties were engaging with customers via different social media channels. We engaged with their marketing department to build out a dashboard pulling data from Facebook, Instagram, Chatmeter, marketing email interactions, and more.
Fortune 500 company empowers employees with easy access to the most accurate, up-to-date information.
Challenge
A Fortune 500 card services company providing loyalty and marketing solutions wanted to improve internal communications and design a virtual workspace for its more than 8,000 employees. The company had envisioned a user-friendly intranet that would allow employees to work together efficiently and securely, but the resulting site didn't adequately fulfill those needs. The company asked Fusion Alliance to step in to bring the concept to life, and the resulting intranet site delivered on the promises of Microsoft SharePoint, including time and cost savings realized through streamlined workflow processes and effective document management.
Background
In theory, intranet sites help companies share information and improve communications. In reality, many sites don't deliver and, over time, become a disorganized navigational maze due to a lack of governance, which further hampers productivity and collaboration. If your employees find it difficult to navigate your intranet and waste time dealing with administrative tasks, you lose productivity and money, putting your business at a disadvantage. Investing in a well-structured, well-thought-out intranet site with built-in governance controls provides a competitive advantage. An easy-to-use site yields efficiency, collaboration, and effective workflow. It empowers employees to have instant access to the latest information and make decisions based on it, which affects not only productivity but customer service. In an economy where customer focus is the order of the day, an investment in internal productivity is an investment in your customers.
Having invested in an intranet that was not sufficiently meeting the workload demands of thousands of users, this organization decided to look for outside expertise to build a site that would be reliable, secure, and do what was originally intended. Companies that experience difficulty with new systems sometimes throw out their entire investment. This client understood the outcomes SharePoint could deliver company-wide and decided to remedy the issues of its out-of-the-box system.
Solution
Fusion stepped into the picture with a plan in place. We completed an inventory of content in the existing SharePoint site to identify current usage patterns and ensure the new templates would accommodate them. We also conducted numerous stakeholder interviews to identify key content items needed for the redesign. The client wanted to build three different site templates: one each for teams, projects, and pages. Using Agile and a proprietary project-management methodology called SureSolve, our team designed a new site for each of the templates and created a branded master page consistent with the organization's style standards. We employed Agile development to create a site that can adapt and evolve with the changing demands of the employees, business, and industry. We designed and created a customized My Documents Web Part, a governance plan and content guide, and a site map and information architecture. Workflows were streamlined and processes put in place so that document and project management were no longer cumbersome. Our team also designed and implemented customizations and other features following Microsoft best practices, with forethought to ensure those customizations would not hamper the client's ability to upgrade SharePoint or move to SharePoint Online in the future.
Challenge
A Fortune 500 card services provider needed a way to optimize how it retrieves and analyzes data for each of its 135 private-label credit card merchants in order to provide customized loyalty marketing services. After working with Fusion Alliance, the client gained a roadmap that outlined areas for improvement and strategies that would enable the company to cater to the needs of merchants and reach additional clients. The end result was a data management and analytics strategy that would build a better future.
Background
The ability to analyze quality data in a timely manner allows loyalty marketing companies to create strategic campaigns targeting new and existing cardholders. As new technology continues to drive the market across industries, it is imperative that loyalty marketing companies implement data-driven solutions to manage the increasing volume of customized client demands. This company's existing data management and analytics platform supported marketing and financial analytics, but they recognized a need to optimize these systems to align with their expanding business vision and strategy. The organization turned to us for help in developing a data management and analytics strategy and a roadmap to guide the optimization journey.
Solution
At the time Fusion got involved, the client's brand partners included 135 different merchants with private-label credit cards. For each merchant, the company provided reports and analytics based on a unique set of requirements. They saw that they needed a strategy to better manage the volume of reports with their current resources while still maintaining the level of customization clients were accustomed to, especially as the brand-partner portfolio continued to grow. In order to implement the best solution possible, we completed a comprehensive evaluation to determine current business needs, requirements, and market opportunities. We assessed the technical landscape and reviewed the client's current strategic plan. Then we worked together to create a data management and analytics strategy and a multi-year roadmap.
Racing the clock: How we helped Blane Canada create a dashboard to measure the economic impact of the coronavirus
Blane Canada needed assistance building a dashboard to provide government leaders with the data needed to evaluate the economic conditions of their communities, and they needed it fast.
Challenge
During the summer of 2020, government leaders needed to continually evaluate the economic conditions of their communities in response to the COVID-19 pandemic. They needed to make strategic decisions about how to help businesses: which ones should reopen, what part of the workforce should remain remote, how long that should continue, and so forth. But there was a lack of data on which to base these decisions.
Illinois-based Blane Canada, Ltd., an economic development services firm, and the volunteer, grassroots BR|E (business retention/expansion) COVID-19 Response Network envisioned a way to quickly provide that data. They created a benchmark survey and follow-up questionnaire to measure the level and severity of the impact and to learn the needs of businesses. The carefully selected questions revolved around the workforce, finances, supply chain, and the future. However, they needed to gather and deliver this benchmark data, and that would be quite a challenge.
Getting the right technology in place
The COVID-19 Response Network wanted a tool that would allow them to analyze and distribute data on a large scale, free to the public. The group needed a technology partner who would take the time to understand the problem, ask the right questions, and build a technology solution against a tight timeline. Eric Canada, CEO of Blane Canada, was confident that Fusion Alliance, his company's technology partner of two years, would be the right fit. He asked if Fusion could build a tool for economic developers to learn the impact of the virus on their business communities. Within two weeks, we had a solution and dashboards up and running, available to the public.
Solution
Our team began by evaluating survey platforms to choose the right one for the task. We conducted proof-of-concept testing to see which platforms met all requirements, and then selected a tool. Next, we determined how to standardize and unify data collection. Then we built the survey, sent it out, and enabled other entities to send the survey as well. That all happened within five days of being approached. After that, the focus was on how to display the results. Our team suggested something similar to the Johns Hopkins coronavirus dashboard, and Canada was on board. A dashboard would allow users to see the story in a visual format and interact with the data. Working against the clock, we built the dashboard and demoed it a few days later to more than 100 organizations in the growing grassroots volunteer network. Two days after the survey was sent, the data began pouring in, and it was aggregated and loaded into the analytics toolset. The group continues to send surveys and follow-up monitoring questionnaires all over the nation, and the dashboard is constantly updated as more results come in. Some companies have participated and submitted up to four monitoring surveys, providing more data points.
"Fusion Alliance is essential to making a difference for our clients, and, more importantly, for economic developers and communities across the country and beyond." Eric Canada, CEO, Blane Canada, Ltd.
Evolving demands require evolving strategies. Learn how one company transformed their customer experience by focusing on data.
Challenge
The ability to analyze quality data in a timely manner allows loyalty marketing companies to create strategic campaigns targeting new and existing cardholders. And as new technology continues to drive the market across industries, it is imperative that loyalty marketing companies implement data-driven solutions to manage the increasing volume of client demands. This Fortune 500 card services provider recognized the need to provide customized loyalty marketing services to its 135 private-label credit card merchants, and realized that the only way to do that was to optimize how they retrieve and analyze data. Their existing data management and analytics platform supported marketing and financial analytics, but they needed to optimize these systems to align with their expanding business vision and strategy. Ultimately, they wanted to:
- Create a strategic data management program to drive data and analytics maturity
- Optimize teams to enable delivery of customized client solutions more efficiently
- Establish greater oversight of goals and results from investing in a strategic data management program
They knew where they wanted to go, but they needed help getting there, so they turned to Fusion to develop a strategic data management and analytics strategy and a roadmap to guide the optimization journey.
Solution
At the time we got involved, the client's brand partners included 135 different merchants with private-label credit cards. For each merchant, the company provided reports and analytics based on a unique set of requirements. They saw that they needed a strategy to better manage the volume of reports with their current resources while still maintaining the level of customization clients were accustomed to, especially as the brand-partner portfolio continued to grow. We completed a comprehensive evaluation to determine current business needs, requirements, and market opportunities. Then we assessed the technical landscape and reviewed the client's current strategic plan. The result was a data management and analytics strategy and a multi-year roadmap to help them achieve their goals. With the full support of the client's executive committee, we began solution delivery, starting with the creation of an Enterprise Data Governance Council appointed to oversee the Data Management and Analytics Program. The process was accelerated by applying Fusion's comprehensive Catalyst Strategic Data Management framework. Through this approach, we completed the data management and analytics strategy and roadmap on a fast-paced schedule, allowing the client to more effectively guide the development efforts needed to deliver analytic enablement in support of business objectives.
Life-saving technology for long-term care: How one healthcare leader tackled problems for the most vulnerable patients
Need to increase brand loyalty? Solve customer challenges using technology. See how a market leader in diabetes care turned the worst pain points in long-term care facilities into an opportunity to save lives and money.
Challenge
The healthcare technology solutions market is saturated with companies vying to stay relevant by delivering game-changing ideas. A leader in diabetes-care solutions had an idea for managing diabetes patients in long-term care settings, and they asked Fusion to be the technology partner to bring it to life. The stakes are high for diabetics in long-term care settings. Any number of seemingly simple-to-avoid missteps may lead to a snowball effect of negative outcomes, ranging from worsened health to patient deaths, not to mention the financial impact on caregivers and facilities.
Identified problem for customers
After significant research, our client identified a frequent, expensive, and potentially fatal problem: unreviewed test results. It is common practice for diabetes patients in long-term care facilities to have standing orders for blood glucose monitoring every few hours. But if the physician did not review the results before the next test was conducted, the insurance company often would not reimburse the facility for the cost of the initial test. And because test results weren't being reviewed in a timely manner, caregivers often did not intervene as quickly as necessary during diabetic episodes, resulting in adverse outcomes such as amputation and diabetic comas, and subsequently higher payments for insurance companies.
Envisioned solution to drive customer loyalty
The client wanted to develop a patient-event notification system that would automatically upload the standard diabetes test results and securely send them to the provider. If the provider did not respond within a set amount of time, the system would automatically follow an escalation protocol, sending the results to the next-level provider, and so on, until the issue was resolved. If the results revealed a diabetic event that needed immediate attention, the provider would be able to create the necessary care plan to minimize negative outcomes. And if the company could create a notification system that worked specifically with its own diabetes-care tools, it would strengthen customer loyalty to its product and brand.
Converted an idea into a technology solution
The challenges of building the technology solution would be compounded by a multitude of security and privacy regulations. The company chose to work with our team because they needed a partner with deep experience in regulated industries, broadly integrated technical proficiency, a clear understanding of security and usability issues, and the ability to meaningfully engage and wrap solutions around business processes. We would need to navigate complex relationships with patients and providers, ensuring the solution would comply with industry regulations and patient confidentiality standards. Our team was excited to get started.
Solution
Our Fusion team helped the client develop, pilot, and grow the diabetic-event notification system: from marketing and trade-show work to a new, high-security data center, a custom application, and an exceedingly well-received, web-based caregiver interface. To implement an event notification system that could securely submit patient information and follow the requested escalation protocols, the company needed to build a new interface.
We built a web-based .NET portal that allowed the results to be transferred from this company's blood glucose meters to the provider for timely review. Our team remained focused on patient privacy and industry regulations and created a high-security data center to ensure information security throughout the portal. Usability was another important piece of the solution, because when providers can't or won't embrace the technology, a great idea isn't worth much. We implemented unified messaging features that allow providers to receive information in a customized way that is convenient for them. The pilot program successfully determined that the new event-notification system would substantially assist in improving patient care, patient satisfaction, and quality of life, while increasing reimbursement rates and lowering operating costs for long-term care facilities. The company earned immediate success and brand loyalty due to its absolute focus on the customer.
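The heart of the system is the timed escalation protocol described in this case study. The following is a minimal sketch of that logic in Python (the production portal was .NET); the provider chain, timeout value, and notification call are illustrative assumptions.

```python
import time
from typing import Optional

# Hypothetical escalation chain: attending physician first, then upward.
ESCALATION_CHAIN = ["attending_physician", "nurse_practitioner", "medical_director"]
RESPONSE_WINDOW_SECONDS = 15 * 60  # review window set by the facility

def notify(provider: str, result: dict) -> None:
    """Stand-in for the portal's secure message delivery."""
    print(f"sent glucose result to {provider}: {result}")

def provider_acknowledged(provider: str) -> bool:
    """Stand-in for checking whether the provider reviewed the result."""
    return False  # always unreviewed here, to exercise the escalation path

def escalate(result: dict) -> Optional[str]:
    """Send the result up the chain until someone reviews it."""
    for provider in ESCALATION_CHAIN:
        notify(provider, result)
        deadline = time.monotonic() + RESPONSE_WINDOW_SECONDS
        while time.monotonic() < deadline:
            if provider_acknowledged(provider):
                return provider  # reviewed; escalation stops
            time.sleep(30)       # poll for acknowledgment
    return None  # chain exhausted; flag for manual intervention
```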
A regional bank wanted to reduce attrition to retain the millions of dollars that flee when customers close their accounts. Could machine learning predict which checking accounts are likely to close so that the bank could change the outcome?
Challenge
With the banking industry in flux, disruptive competitors grabbing market share, and customers raising the bar on the experiences they expect, banks must find ways to attract and retain customers on a level never known before. Today's banking customers grow impatient more quickly than in the past, and if they are unhappy with an experience, their loyalty is fleeting. By industry estimates, each year most banks lose about 10% of their account deposits due to customers closing their accounts. Of that segment, 50% leave because they are dissatisfied with the bank's service, fees, rates, products, or lack of convenience. The other 50% leave due to events the bank cannot control, such as death, divorce, or displacement. For example, if a bank handles $700 million to a billion dollars in deposits annually, nearly $100 million in capital walks out the door each year due to customers closing their accounts.
A regional bank saw an opportunity to reduce attrition in this area. This long-time client wanted to be able to predict which checking accounts were likely to close within the next 90 days so they could take action to retain the customer. They knew machine learning could provide them with that data, but they had never leveraged it before. Machine learning is a data science technique that analyzes massive quantities of data, especially historical data, to discover trends and insights and rapidly predict future behaviors and outcomes. The technique lets the data learn from itself, free of human bias or the need for explicit instruction. Traditional analytics tools cannot rapidly uncover patterns when there are billions of data points to be analyzed, nor can humans identify patterns in such large quantities of data, let alone in real time. This is the value of machine learning in enabling your data to be a market differentiator, and that's why this bank wanted to explore a proof of concept through a Fusion Alliance Machine Learning Jumpstart.
Solution
Our team had more than one goal when we began this jumpstart. Foremost, we wanted to develop a machine learning model for this bank to reduce deposit attrition. We also wanted to support the education of the client's team on the key elements of machine learning, such as the process for training models and the process for generating predictions. This would occur organically as we worked side by side and guided them through the journey. The bank was interested in machine learning because it decreases the risk and expense of traditional analytics by allowing the data to speak for itself. They additionally wanted to understand the key metrics for evaluating machine learning models.
Innovation
Use case identification. Prior to beginning our technical work, we explored a variety of use cases in a workshop with the bank's business and technical stakeholders. Our team rated the potential use cases on criteria such as how complex the model would be to develop, what data was available, and the value impact to the bank. Together we agreed on the deposit attrition proof of concept, deciding it would drive maximum predictive value with minimal risk.
That would be the basis for the succeeding steps.
Data processing. We performed an inventory of existing data, sourced the data, cleaned it, and loaded it into the target on-premises environment where the models would be developed. We provided the option of loading the models in the cloud to enable additional ML models and more complex computations.
Machine learning model development. Within three weeks, we began to engineer the machine learning models, choosing the subset of data most relevant to the question, "Which checking accounts are likely to close in the next 90 days?" We selected the machine learning algorithms, then trained and tuned the model. We then met with the bank's stakeholders to present the metrics used to measure the model's success, focusing on KPIs.
Model insights integration. In this step, accounts at high risk of closing are referred to the bank's retention team, and Fusion helps expose these insights so that the bank can take action. The end-to-end process to generate daily predictions using real-time data can be accomplished in less than an hour using the bank's current infrastructure.
As a next step, optimization can occur. In this phase, model efficacy is captured, and the bank can optimize and expand the use case or use the model as a template that can be modified for one of the other vetted use cases. The entire proof of concept took eight weeks, and the bank now possesses machine learning models that can be implemented in marketing campaigns in the next phase.
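For readers new to the model development step, here is a minimal scikit-learn sketch of a 90-day closure classifier of the kind described above. The feature names and data file are hypothetical; the pattern (split, train, evaluate on held-out accounts) is the standard one.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Hypothetical account-level features with a label indicating whether
# the account closed within 90 days of the snapshot date.
df = pd.read_csv("checking_accounts.csv")
features = ["balance_trend", "deposit_count_90d", "fee_events", "tenure_months"]
X, y = df[features], df["closed_within_90d"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = GradientBoostingClassifier()
model.fit(X_train, y_train)

# Precision and recall matter more than raw accuracy here, since
# closures are rare relative to accounts that stay open.
print(classification_report(y_test, model.predict(X_test)))
```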
A large institutional bank solved a recurring issue: how to more accurately predict cash reserves. See how machine learning made it possible.
Challenge
Managing bank and credit union reserve cash is a complex exercise: manage it too tightly, and your institution may be subject to high-interest Federal Reserve borrowing fees; manage it too loosely, and your firm may lose out on substantial interest revenue from parked cash. Our client, a wholesale financial services provider to hundreds of credit unions in the U.S., traditionally kept a large volume of cash in reserve to account for member credit union activity. Since these credit unions conduct business autonomously, the organization was constantly challenged to predict members' cash reserves without any direct control or visibility. It was time to explore opportunities to apply advanced analytics to predict member activity and drive better returns on reserve cash, and that's what led to their partnership with the Fusion Alliance team.
Solution
While this client was unable to directly influence credit union spending and borrowing, they possessed one critical asset: decades of financial transaction data to support the cash reserve engagement. Company leaders understood there were patterns in the member credit union data based on calendar milestones (payroll activity, mortgage payment activity, etc.) but needed help identifying these regularities in the noise across hundreds of credit unions and billions in cash. This project explored 18 years of historical cash data to predict the next 60 business days of member activity, in aggregate and by cash account. The initiative additionally provided a discrete view for the investment desk to simulate cash and borrowing needs and partner effectively with finance. Ultimately, the machine learning algorithm selected would need to favor recent history while still accounting for the entire body of transactions. To accomplish this, our team:
- Landed and cleaned data in the company's Azure cloud
- Accumulated success metrics on a variety of algorithms to achieve the desired liquidity aims for the organization
- Ultimately selected a long short-term memory (LSTM) recurrent neural network (see the sketch after this list)
Once we achieved the desired metrics for cash management, our team:
- Developed an analytical website solution that:
  - Allowed the company's finance team to feed in new data
  - Exposed long-term liquidity analytics for the investment team to effectively manage bank cash in the big picture
- Secured the environment according to bank best practices
- Developed a weekly retraining process to keep the LSTM models current
- Integrated the solution with a machine learning web service hosted in Azure
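To make the model choice concrete, here is a minimal Keras sketch of an LSTM that maps a trailing window of daily cash balances to the next 60 business days, matching the forecasting horizon described above. The window length, layer sizes, and training setup are illustrative assumptions, not the tuned production model.

```python
import numpy as np
import tensorflow as tf

LOOKBACK, HORIZON = 120, 60  # trailing days in, business days forecast out

def make_windows(series: np.ndarray):
    """Slice a daily balance series into (lookback, horizon) training pairs."""
    X, y = [], []
    for i in range(len(series) - LOOKBACK - HORIZON):
        X.append(series[i : i + LOOKBACK])
        y.append(series[i + LOOKBACK : i + LOOKBACK + HORIZON])
    return np.array(X)[..., None], np.array(y)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(LOOKBACK, 1)),
    tf.keras.layers.Dense(HORIZON),  # one output per forecast day
])
model.compile(optimizer="adam", loss="mse")

# daily_balances: one account's normalized history (placeholder data here).
daily_balances = np.random.rand(18 * 260)  # roughly 18 years of business days
X, y = make_windows(daily_balances)
model.fit(X, y, epochs=10, batch_size=32, validation_split=0.1)
```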
Data management in banking poses unique challenges. You're dealing with vast amounts of sensitive information, rigid regulations, and security issues, all of which can complicate the process of actually managing and using the data you collect. Given our long experience with data management in financial services, we jumped at the chance to help a regional bank streamline its data strategies. After we helped them transform their operations through an enterprise data management program, the client saw a staggering 1,054% ROI over three years.
Challenge
A regional bank's need to prevent and reduce credit losses from defaulted commercial loans was symptomatic of a greater challenge: the bank needed a data management program that could help it more effectively manage different aspects of the business. Read on to learn how a new finance data strategy helped our client triumph over the core challenges of 1) meeting stringent regulatory demands for more robust reporting and 2) dealing with issues surrounding its data and data access.
A common challenge in banking
Each year, banks approve billions of dollars in commercial loans. Throughout the approval process, documents are signed and covenants are created to ensure that funds will be repaid. Funds not repaid within the outlined term can result in higher capital requirements for the institution and, ultimately, credit losses for the bank. Most often, the only indicator that a loan has gone bad is when payments become delinquent, which is too late. Our client wanted to analyze such scenarios well in advance to prevent payment default. And that's what led to the need for an improved bank data management program.
Solution
Our team's assessment revealed that this bank's ability to quickly uncover and manage credit loss was constrained by a lack of consistent, quality data and by static reporting and manual processes. In addition to regulatory issues, other issues to resolve included:
- Incomplete and inconsistent data
- A desire to have more time to analyze data before monthly, quarterly, and yearly reporting
- An inability to see the "story" behind the data
- An inability to interact with the data through visualization tools
Summary of deliverables
- An Enterprise Data Management framework that included culture, people, process, and technology change management
- The enablement of a new data leader, i.e., a chief data officer
- Numerous executive dashboards:
  - Status dashboards: accrual, AQR status, charge-off and recovery, delinquency, and others
  - Trend dashboards: commercial portfolio, retail and mortgage portfolio, charge-offs, large dollar exposure, etc.
  - Operational dashboards: delinquencies and maturities
  - Alert dashboards: accruals, loan structure alerts, AQR alerts, etc.
Challenge
One of the nation's largest credit and debit card transaction processing companies saw the industry headed toward commoditization and omnichannel processes. They realized there was significant value, for the merchants whose transactions they process, in the dark data of every card transaction they handled, whether an authorization, a decline, or a settlement. However, this data, together with merchant, financial institution, and consumer information, was highly proprietary. It included PII that could not be used for purposes other than those defined to authorize and settle transactions. The processor was torn: they knew they needed to adopt new processes and technologies, but they did not want to risk the security and efficiency of their services. They were also looking to expand to other channels and create competitive new product lines, something that would require data that had previously been unavailable to them.
Company leaders knew they'd have to rethink the very foundation of how they do business, and that included taking a hard look at both their capabilities and their gaps. They realized a new data transformation strategy based on credit card data analytics would be the key to moving quickly and keeping ahead of trends. This strategy would allow them to evaluate their existing technologies against newer, more nimble ones and manage risk while maintaining current service levels. This led them to turn to Fusion for help creating a digital banking strategy roadmap. Fusion would need to build an environment that could process and analyze billions of data transactions, integrate with other internal and external datasets, and make headway in understanding consumer behavior. The environment also needed to preserve the privacy, integrity, and quality of the data to ensure any resulting analytics and insights were valid.
Solution
As a first step, we identified obstacles that could potentially prevent us from creating an environment for managing and interpreting data. Next, we defined a framework for governance and controls to ensure all credit card transaction data and resulting analytics are kept confidential. During this discovery process, we also evaluated the processor's current technology landscape, providing recommendations on what could be repurposed and what should be replaced. This auditing process set the stage for building a digital banking strategy roadmap, in which we identified and prioritized business opportunities and drivers. We defined the roadmap by outlining the steps and initiatives to be executed over the next several years, including:
- Defining the architecture
- Selecting the technology
- Implementing data governance
- Obfuscating the data (see the sketch after this list)
- Creating the desired data management platform capabilities
We also identified change agents and thought leaders who would propel the organization to success with their new platform, and we implemented processes to create alignment between business leaders and IT.
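As an illustration of the data obfuscation step, here is a minimal sketch of card-number tokenization: the PAN is swapped for a surrogate so analytics can count and join on cards without ever handling the real number. The token scheme and vault are illustrative assumptions, not the processor's actual design.

```python
import secrets

# Hypothetical in-memory token vault; a production system would use a
# hardened, audited service for this mapping.
_vault: dict[str, str] = {}

def tokenize_pan(pan: str) -> str:
    """Swap a card number for a random surrogate, keeping the last four
    digits so analysts can still perform common reconciliation tasks."""
    if pan not in _vault:
        surrogate = "".join(secrets.choice("0123456789") for _ in range(12))
        _vault[pan] = f"tok-{surrogate}-{pan[-4:]}"
    return _vault[pan]

txn = {"pan": "4111111111111111", "amount": 42.50, "merchant": "m-1009"}
safe_txn = {**txn, "pan": tokenize_pan(txn["pan"])}
print(safe_txn)  # analytics-safe record; the real PAN never leaves the vault
```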
PFC partnered with Fusion to transition from depending on legacy systems and manual processes to embracing new technologies and the cloud, transforming the way they operate and deliver customer solutions.

Challenge
Midwestern financial services firm Primary Financial Company (PFC) was more than ready for change. The company was operating successfully on three legacy technology platforms, but peak activity periods severely strained technology, employees, and resources. As a broker of millions of dollars in CDs daily, PFC needed a flexible technology solution that would adapt to future needs and align with business objectives. The company turned to Fusion Alliance for help. The tremendous success of that first venture led to three more initiatives, including data and machine learning work, to meet the evolving challenges and needs of the marketplace.

Solution
The relationship between PFC and Fusion Alliance is a story of tremendous ROI and a partnership built on trust. PFC constantly seeks to improve the investor experience, and early on, we developed a collaborative, single-minded approach to meet PFC’s vision. Over the years, PFC has trusted our expertise to lead the company into initiatives and functionality whose value may not have been obvious at the outset but which paid off immensely once implemented.

First initiative: overhaul existing systems and streamline processes
PFC’s work, managing an investment trading platform through which institutional investors invest thousands to millions of dollars in federally insured CDs, is complex, especially in a highly regulated environment. All that PFC’s customer investors want is a reliable platform that provides access to new products and tools so they can trade expeditiously in real time.

When we first began working with PFC, the primary challenge was to find a technology solution that would adapt to future needs. PFC was running operations on three disparate legacy systems. IT spent the majority of its time maintaining those systems and an antiquated web portal rather than adding functionality. Manual processes slowed productivity. Costs related to on-premises equipment, maintenance, security, and off-site data recovery were considerable.

Starting with a clean slate gave PFC an opportunity to reevaluate processes, implement innovative ways to serve investors, and provide tools to enable insightful decision-making. Our Fusion team developed a single platform with a more functional web interface to replace the legacy systems. That streamlined and automated numerous manual processes, instantly increasing productivity.

We convinced PFC that their business and technology objectives would be best met by migrating to the cloud. PFC took the leap, becoming one of the first financial firms in the nation to move to the cloud. The cloud eliminated on-premises servers, equipment, maintenance, and security costs. Off-site disaster recovery was replaced with cloud redundancy. An overabundance of hard-copy documents was replaced with scanned PDFs stored in the cloud. All this freed up office space and created substantial savings.

All told, the body of this work with PFC:
- Removed expensive, geo-redundant datacenters
- Enhanced efficiency, scalability, and reliability
- Created significant process improvements
- Boosted productivity
- Enriched client and employee experiences
- Increased brand confidence
- Widened market reach

Mark Solomon, PFC President and CEO at the time, was very pleased at the conclusion of this work, which led to further partnering.
New product lines and more
When PFC later wanted to make adjustments to accommodate new product lines, partnering with Fusion again was the obvious choice. We added the functionality, the necessary coding, and an easily extensible framework so that PFC could push out similar new product offerings on their own, decreasing their dependence on Fusion.

Thinking of future needs, we advised PFC to implement Microsoft Power BI’s visual analytics to gain interactive reporting capabilities that could be used to make informed decisions. Interactive visualizations are now invaluable to PFC. We also recommended adding a single sign-on option, more robust security safeguards, and extended data capabilities. At the time, PFC had not considered such capabilities but trusted Fusion’s expertise. A couple of years later, when such functionality was in demand, PFC was ahead of the game.

Applying machine learning to improve sales
In another endeavor, Fusion and PFC are using machine learning to improve sales targeting by forecasting potential buyers and sellers of CDs. We will be able to ascertain, with over 80% accuracy and 70% precision, the likelihood of a particular investor buying a given investment (a brief illustration of these two metrics follows below).

Analyzing user data to drive sales
Today, the Fusion team is also helping PFC look at who does and does not purchase its products when interacting with their site. By analyzing the data, PFC can draw insights and identify opportunities to increase sales.

Summary
PFC is constantly thinking of the next step and partners with Fusion to help make it happen. For example, recent developments include implementing push notifications so that PFC clients can receive alerts by text or email. PFC’s ongoing commitment to investor satisfaction and willingness to embrace cutting-edge technology create a solid foundation for the company’s continued success.

“Our clients and employees have a greatly improved experience, and Fusion continues to be an invaluable partner to our company.”
(Former) PFC President and CEO, Mark Solomon
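Accuracy and precision measure different things, so a toy example may help make those two figures concrete. The labels below are fabricated purely for illustration; this is not PFC’s data or model output.

```python
# Toy illustration of the accuracy and precision metrics cited above.
# Accuracy: share of all predictions that were correct.
# Precision: of the investors flagged as buyers, the share who actually bought.
from sklearn.metrics import accuracy_score, precision_score

actual    = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]  # 1 = investor actually bought
predicted = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]  # 1 = model predicted a buy

print(accuracy_score(actual, predicted))   # 0.8  -> 8 of 10 calls correct
print(precision_score(actual, predicted))  # 0.75 -> 3 of 4 flagged buyers bought
```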
This bank wondered if they could use their data to understand their ideal customer and identify opportunities. Here’s how we helped them use machine learning to achieve their goals.

Challenge
A growing bank was relying on word-of-mouth and referrals to acquire new commercial customers. Decisions on the best target customers were based primarily on a banker’s personal experience, and prospecting was somewhat random rather than strategic, making the entire sales pipeline unpredictable and difficult to scale. In addition, CRM utilization was inconsistent and incomplete across the organization, and data was siloed and decentralized. All of this limited the effectiveness of using data insights to support sales and marketing.

The bank’s leadership was seeking answers to mission-critical questions that could allow them to grow market share in the region. These questions included:
- Who is our ideal customer? What characteristics, behavior, and experience with the bank make a customer ideal?
- How can we use what we know about our ideal customers to identify the best prospects?
- Can we improve the number of prospects that become qualified sales opportunities?
- How many ideal target customers are in our region?
- Can we develop an outbound marketing program for our prospects and effectively use CRM to manage it?
- Is there a way to prioritize which prospects to target?
- How can we improve the quality of data used for sales and marketing?
- How can we create a true segmented marketing strategy?

Bank leaders wanted to change the paradigm used by the bankers to influence the focus of sales and prospecting efforts, so they brought Fusion in to assess the situation.

Solution
Fusion presented a strategy and machine learning approach to build a solution that would allow the bank to identify the ideal customer using data, target prospects based on that understanding, and identify new opportunities in the market.

Machine learning is a data science technique that uses the breadth and depth of data, especially historical data, to rapidly predict or forecast future outcomes, behaviors, and trends. The technique lets the data speak for itself without human bias, preconceived notions, or the need for explicit instructions. And because the quantity of data can be massive, machine learning can identify patterns that humans cannot uncover or recognize, especially when real-time decision-making is needed.

There’s much more to this story than just defining a problem and throwing machine learning at it. The real story is about the process you take to find your solution. It is an iterative process in which you learn from the insights gained and use them to make continuous improvements. Our approach to identifying the ideal customer and improving prospecting for this client consisted of five key steps:

1. Defining the characteristics that make up an ideal customer. To gather different perspectives from within the organization (finance, marketing, sales, etc.), we spoke individually with key business stakeholders and asked what they believed were the characteristics of the ideal customer. Then we brought the stakeholders together in a workshop to create alignment and clarity about which attributes they collectively would identify with the ideal customer. Next, we brainstormed and selected high-value use cases for the machine learning models.
2. Understanding their data, its viability, and its readiness to support the “ask.” Machine learning initiatives are only as successful as the quality of the data, so profiling the bank’s existing data was a necessary step. We identified all available data and performed an analysis to assess data quality and completeness against the defined objectives. Identifying the best source of the data was an essential part of this exercise. Analysis at this stage can reveal where remediation must occur, either to improve data deemed to be of poor quality or to fill gaps in essential data. Once the data elements have been remediated, there is a solid foundation on which to apply machine learning.

3. Developing a machine learning model that leverages the significant characteristics for use against prospects or existing customers. In this step, we allowed the customer data to speak for itself and identify the characteristics of an ideal customer. This involved:
- identifying the data elements to use as model inputs, based on the data profiling
- provisioning a cloud environment and developing data ingestion
- defining and developing machine learning predictive models that supported the defined use cases
- executing the model against real data and assimilating the output to graphically show the customer segmentation

4. Finalizing the ideal customer definition and refining the model. We used stakeholder inputs, data profiling outputs, and machine learning to let data and actual outcomes influence the definition of the ideal customer. We showed stakeholders what the model said the ideal customer was compared to what they had said. This knowledge enabled a discussion that led to alignment on a final definition. Then we agreed on the final criteria and attributes and how they would be used to align with specific prospecting initiatives. Ultimately, the definition of an ideal customer depends on the context of your objective and will therefore result in multiple profiles that align with those objectives. Next, we built the machine learning models that would score (or rank) customer and prospect lists against the ideal customer model. Then we operationalized the model for use in marketing campaign processes.

5. Running the model against lists of potential or existing customers for the purpose of acquiring new customers and business. In this stage, we developed the prospect target list and scored the prospects (a minimal sketch of this scoring step follows below). When ready, the company can execute the marketing plan based on the prospects. Over time, the expectation is that new models, based on different features, will be developed to align with different sales and marketing objectives.
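To make step 5 concrete, here is a minimal sketch, in Python with scikit-learn, of scoring a prospect list against a trained ideal-customer model. The column names, files, and choice of algorithm are assumptions for illustration, not the bank’s actual implementation.

```python
# Hypothetical sketch of step 5: rank prospects against an "ideal
# customer" model trained on labeled historical customers (step 4).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Historical customers labeled 1 (ideal) or 0 (not) during refinement.
customers = pd.read_csv("customers_labeled.csv")
features = ["annual_revenue", "employee_count", "years_in_business",
            "num_products_held", "avg_monthly_balance"]

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(customers[features], customers["is_ideal"])

# Score the prospect list: a higher probability means a closer match
# to the ideal-customer profile.
prospects = pd.read_csv("prospects.csv")
prospects["ideal_score"] = model.predict_proba(prospects[features])[:, 1]
target_list = prospects.sort_values("ideal_score", ascending=False)
target_list.head(100).to_csv("top_prospects.csv", index=False)
```

Operationalizing the model, as described in step 4, would wrap this scoring in a scheduled job whose output feeds directly into the CRM for campaign execution.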
How does a financial services firm improve sales targeting to predict its clients’ desire to invest? Machine learning was the answer for PFC. Find out why.

Challenge
Long-time client Primary Financial Company (PFC) manages an investment program for institutional investors to invest substantial funds in federally insured CDs. The company:
- Monitors, tracks, collects, and disburses principal and interest on nearly 40,000 CDs
- Manages over $7 billion in assets
- Supports relationships with 5,000 financial institutions and institutional investors

PFC wanted to improve sales targeting to predict CD issuers’ funding needs and institutions’ desire to invest. It partnered with Fusion on a pivotal initiative to explore how advanced analytics and machine learning could drive data-driven, predictive outcomes.

Solution
Organizations with the expertise to leverage machine learning will significantly widen the gap between themselves and competitors who can’t move beyond traditional analytics tools. Fusion’s data science and application development teams work together to unlock business insights. We allow our clients’ data to tell the story without introducing bias into the underlying predictions. This hybrid approach allows us to apply agility to the process and rapidly operationalize insights into the tools and platforms you use every day.

PFC understood the value proposition of machine learning and partnered with Fusion to explore the following machine learning models:
- Identify the best issuers for sales solicitation, including former, current, and prospective issuers
- Provide rate guidance to investors and rate/term guidance for CD issuers
- Target investors by likelihood to close

The process included four main steps: data acquisition, transformation, model development, and predictive analytics. All relevant private and public data sources were identified and acquired to gain more information on current and prospective customers. PFC and Fusion then collaborated to determine meaningful and available factors. Next, the data was transformed so these factors would be consistent and accurate. With a solid foundation, Fusion developed machine learning models that would learn and identify patterns, then recognize those patterns when seen again to predict outcomes (a sketch of what the rate guidance model might look like follows below).

The first phase was highly successful, producing the following benefits, with more to come in the next phase:
- Equipped the PFC sales team with a qualified list of issuers to target based on profiles of previous customers, enabling efficient use of PFC’s finite sales and marketing budget
- Provided a deeper understanding of where CD issuers need to price their instruments and the rate at which investors are likely to purchase; understanding this spread allows PFC to potentially achieve larger-scale trading business
- Provided a simple way for PFC’s co-brokers to market the right product at the right time to their portfolio of investors
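As an illustration of the second model above, rate guidance, here is a minimal Python sketch of how a regressor might estimate the rate at which a CD of a given term and size is likely to clear. Every feature name, file, and figure is an assumption for illustration, not PFC’s actual model.

```python
# Hypothetical rate-guidance sketch: estimate a clearing rate for a CD
# from historical trade data. Features and values are illustrative only.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

trades = pd.read_csv("historical_cd_trades.csv")
features = ["term_months", "face_value", "issuer_asset_size",
            "treasury_yield_pct", "days_to_settlement"]

model = GradientBoostingRegressor(random_state=42)
model.fit(trades[features], trades["clearing_rate_pct"])

# Guidance for a hypothetical new issuance: a 12-month, $5M CD.
quote = pd.DataFrame([{
    "term_months": 12,
    "face_value": 5_000_000,
    "issuer_asset_size": 2_500_000_000,
    "treasury_yield_pct": 4.3,
    "days_to_settlement": 3,
}])
print(f"Suggested rate: {model.predict(quote)[0]:.2f}%")
```

The spread between this issuer-side estimate and an investor-side purchase-rate estimate is what lets a broker price instruments so both sides transact.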
Learn how an enterprise organization found a sustainable and cost-effective way to stay ahead of technology changes and develop market-driven solutions.

Challenge
McGraw-Hill Education (MHE), one of the Big Three providers of education content, recognized years ago the significant opportunity presented by changing technology in the classroom. Educators were under pressure to provide individual, personalized learning to every student. If MHE could deliver insightful, technology-based education products and services, they could demonstrate their commitment to providing customers with state-of-the-art, adaptive learning systems. In doing so, they would redefine themselves as a large corporation with the capacity to innovate in a changing market. They knew partnering with a company experienced in developing technology solutions was key, and that’s how the long and collaborative relationship between MHE and our Fusion team began.

Solution
When our Fusion team first began working with MHE, their main challenge was the tremendous demand that producing innovations would place on the organization’s mission to deliver high-quality content and materials, especially since their primary expertise when we started working together was print-based, not digital. MHE looked to us for thought leadership and a strategic vision, and to leverage our application developers, scrum masters/project managers, quality assurance resources, application technical leaders, and technical product managers to help execute the vision.

Different development teams were organized to deliver numerous large-scale educational products that continue to be used in classrooms today. For example, we led the architecture and business-case discussion for deploying application services in the cloud, which now serve more than four million students. As the success of the partnership continued and specific products were delivered, MHE expanded this practice across their digital platform group.

Another benefit emerged serendipitously. When MHE originally partnered with Fusion in 2010, Fusion was a thought leader in the Agile space. As such, we introduced Agile development principles and the Scrum development framework to MHE’s team, who began to experience firsthand the difference an Agile environment can make. Fusion’s Agile framework became widely adopted within MHE.
Do you have what you need to create the right digital strategy and effectively reach your customers? A digital analytics assessment helped Seven Corners pave a new path.

Challenge
Seven Corners, a comprehensive travel insurance company, was ready to change their overall digital strategy. They wanted to increase their focus on digital analytics but quickly realized that their foundational data was not consistent or accurate. Multiple data sources, including lead-generation data, agencies, and data analytics, told different stories. The company needed to establish an authoritative source of truth against which they could measure all other tactics.

“This was one of the most valuable projects for us in 2018. It served as a springboard for us to drive significant improvements in return on marketing investment.”
Greg Jung, VP of Marketing, Seven Corners

Solution
After identifying that inaccurate and inconsistent data was being collected in Google Analytics (GA) via Google Tag Manager (GTM), our Fusion team conducted a comprehensive audit and assessment of Seven Corners’ digital analytics implementation and evaluated it across the primary interactions on the website. We provided Seven Corners with:
- 40 actionable, prioritized recommendations to significantly improve the accuracy and quality of the data being collected
- A detailed list of issues in GA and GTM that needed to be corrected
- Suggestions for future use of features within GA and other tools to enhance digital analytics capabilities
- Best-practices recommendations
- Suggestions for improvements to implement after resolving the prioritized issues
See how Donatos used machine learning to retain 45% of its customers at risk of leaving.

Challenge
Family-owned Donatos Pizza needed a new recipe to differentiate itself in an overcrowded market, and this one involved machine learning. This is the story of how Donatos used this advanced analytics tool to solve a problem and achieve its goal of retaining more new customers.

Solution
Donatos was sitting on a wealth of customer data: demographic information, what customers ordered, how they paid, time of order, time promised, cost of purchase, complaints, and much more. This abundance of data made it easier for Donatos and our Fusion team to pilot a machine learning model in selected stores across the country. The pilot program also included a control group for comparison.

Creating and implementing the machine learning model involved:
- Putting Donatos’ extensive data on a cloud platform to accelerate the process
- Loading and landing the data to let the machine learning algorithms do their job
- Evaluating and selecting the Donatos data most capable of providing accurate answers (this step included pulling in the source data, aggregating it, and then filtering it for aberrations, such as orders not expected to repeat, like those from an out-of-town business traveler)
- Assessing the quality and quantity of the data
- Cleansing the data for use as a training set, which was used to identify the machine learning algorithm that would produce the most accurate model for predicting who would stay or leave

With the foundation set, each day we ran the previous day’s sales in each of the pilot stores against this model to produce a list of customers who were highly likely to leave (a minimal sketch of this daily scoring loop follows below). Store managers then took action to win these identified customers back. Though it was a short trial, the results were impressive.
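The daily scoring loop described above is simple to picture in code. Below is a minimal Python sketch under assumed file names, column names, and an assumed risk threshold; it is illustrative, not Donatos’ actual pipeline.

```python
# Hypothetical daily churn-scoring job: score yesterday's customers and
# hand store managers a list of those likely to lapse. All names and the
# 0.7 threshold are assumptions for illustration.
import pandas as pd
from joblib import load

model = load("churn_model.joblib")  # classifier trained on the cleansed set
features = ["order_count", "days_since_last_order", "avg_ticket",
            "complaint_count", "promised_vs_actual_minutes"]

yesterday = pd.read_csv("orders_yesterday.csv")
yesterday["churn_risk"] = model.predict_proba(yesterday[features])[:, 1]

# Flag high-risk customers for a win-back offer.
at_risk = yesterday[yesterday["churn_risk"] > 0.7]
at_risk[["customer_id", "store_id", "churn_risk"]].to_csv(
    "at_risk_today.csv", index=False)
```

Comparing win-back rates in the pilot stores against the control group is what made the 45% retention result measurable.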