Aunalytics Acquires Naveego to Expand Capabilities of its End-to-End Cloud-Native Data Platform to Enable True Digital Transformation for Customers

Naveego Data Accuracy Platform Provides Comprehensive Data Integration, Data Quality, Data Accuracy and Data Governance for Enterprises to Capitalize on Data Assets for Competitive Advantage

South Bend, IN (February 22, 2021) - Aunalytics, a leading data platform company delivering Insights-as-a-Service for enterprise businesses, today announced the acquisition of Naveego, an emerging leader in cloud-native data integration solutions. The acquisition combines the Naveego® Complete Data Accuracy Platform with Aunalytics’ Aunsight™ Data Platform to enable the development of powerful analytic databases and machine learning algorithms for customers.

Data continues to grow at an explosive rate and is continuously changing, driven by a myriad of data sources: artificial intelligence (AI), machine learning (ML), the Internet of Things (IoT), mobile devices and other sources outside of traditional data centers. Too often, organizations ignore the exorbitant costs and compliance risks associated with maintaining bad data. According to a Harvard study, 47 percent of newly created records have some sort of quality issue. Other reports indicate that up to 90 percent of a data analyst’s time is wasted on finding and wrangling data before it can be explored and used for analysis.

Aunalytics’ Aunsight Data Platform addresses this data accuracy dilemma with the introduction of Naveego into its portfolio of analytics, AI and ML capabilities. The Naveego data accuracy offering provides an end-to-end cloud-native platform that delivers seamless data integration, data quality, data accuracy, Golden-Record-as-a-Service™ and data governance, enabling customers across the financial services, healthcare, insurance and manufacturing industries to make real-time business decisions.

Aunalytics will continue to innovate advanced analytics, machine learning and AI solutions, including the company’s newest Daybreak™ offering for financial services. Unlike other “one-size-fits-all” technology solutions, Daybreak was designed exclusively for banks and credit unions, with industry-specific financial intelligence and AI built into the platform. Daybreak seamlessly converts rich transactional data into actionable, intelligent insights that answer customers’ most important business and IT questions.

“I’m extremely excited to be leading this next chapter of innovation and growth for Aunalytics and to provide our customers with a new era of advanced analytics software and technology service coupled with Naveego’s data accuracy platform,” said Tracy Graham, CEO, Aunalytics. “Now enterprises have the assurance of data they can trust along with actionable analytics to make the most accurate decisions for their businesses to increase customer satisfaction and shareholder value.”

Tweet this: .@Aunalytics Acquires Naveego to Expand Capabilities of its End-to-End Cloud-Native Data Platform to Enable True Digital Transformation for Customers #DataPlatform #DataAnalytics #DataIntegration #DataAccuracy #ArtificialIntelligence #AI #MasterDataManagement #MDM #DataScientist #MachineLearning #ML

About Aunalytics

Aunalytics is a data platform company delivering answers for your business. Aunalytics provides Insights-as-a-Service to answer enterprise and midsized companies’ most important IT and business questions. The Aunalytics® cloud-native data platform is built for universal data access, advanced analytics and AI while unifying disparate data silos into a single golden record of accurate, actionable business information. Its Daybreak™ industry intelligent data mart combined with the power of the Aunalytics data platform provides industry-specific data models with built-in queries and AI to ensure access to timely, accurate data and answers to critical business and IT questions. Through its side-by-side digital transformation model, Aunalytics provides on-demand scalable access to technology, data science, and AI experts to seamlessly transform customers’ businesses. To learn more, contact us at 1-855-799-DATA or visit Aunalytics at http://www.aunalytics.com or on Twitter and LinkedIn.

PR Contact: 

Sabrina Sanchez
The Ventana Group for Aunalytics
(925) 785-3014
sabrina@theventanagroup.com


Aunalytics named one of the 2021 Best Tech Startups in South Bend, IN


Aunalytics has been named one of South Bend’s Best Tech Startups for 2021 by The Tech Tribune. We are honored to have received this award for the past four years. Consideration is based on revenue potential, leadership, brand traction, and the competitive landscape.

About Aunalytics

Aunalytics delivers insights as a service to answer enterprise and midsized companies’ most important IT and business questions. The Aunalytics cloud-native data platform is built for universal data access, powerful analytics, and AI. With the data platform deployed and managed securely, Aunalytics reliably turns disparate data silos into a golden record of accurate business information and uses its powerful analytics to provide answers.

Its Daybreak industry intelligent data mart combines the power of the data platform with industry-specific data models to provide access to timely, accurate data, and answer critical business and IT questions with advanced built-in queries and AI.

Through its side-by-side digital transformation model, Aunalytics provides on-demand scalable access to technology, data science, and AI experts to help transform its partners’ businesses.

The complete list of honorees is featured on The Tech Tribune website.


Daybreak Analytic Database

Daybreak: A Foundation for Advanced Analytics, Machine Learning, and AI

Financial institutions have no shortage of data, and most know that advanced analytics, machine learning, and artificial intelligence (AI) are key technologies that must be utilized in order to stay relevant in the increasingly competitive banking landscape. Analytics is a key component of any digital transformation initiative, with the end goal of providing a superior customer experience. This digital transformation, however, is more than simply digitizing legacy systems and accommodating online/mobile banking. In order to effectively achieve digital transformation, you must be in a position to capitalize on one of your greatest competitive assets—your data.

However, getting to successful data analytics and insights comes with its own unique challenges and requirements. An initial challenge concerns building the appropriate technical foundation. Actionable BI and advanced analytics require a modern, specialized data infrastructure capable of storing and processing large volumes of transactional data in fractions of a second. Furthermore, many financial institutions struggle not only with technical execution, but also lack the personnel and skillsets required to manage an end-to-end analytics pipeline—from infrastructure to automated insights delivery.

In this article, we examine some of the most impactful applications of advanced analytics, machine learning, and AI for banks and credit unions, and explain how Daybreak for Financial Services solves many of these challenges by providing the ideal foundation for all of your immediate and future analytics initiatives.

Machine Learning and Artificial Intelligence in the Financial Industry

Data analysis provides a wide range of applications that can ultimately increase revenue, decrease expenses, increase efficiency, and improve the customer experience. Here are just a few examples of how data can be utilized within the financial services industry:

  1. Inform decision-making through business intelligence and self-service analytics:

While banks and credit unions collect a wide variety of data, traditionally, it has not always been easy to access or query this data, which makes uncovering the desired answers and insights difficult and time-consuming. With the proper analytics foundation, employees across the organization can begin to answer questions that directly influence both day-to-day and long-term decision-making.

For example, a data-informed employee could make a determination on where to open a new branch based on where most transactions are taking place currently, or filter customers by home address. They could also determine how to staff a branch appropriately by looking at the times of day that typically have the most customer activity, and trends related to that activity type.

  2. Improve collection and recovery rates on loans:

By implementing pattern recognition, risk and collection departments can identify and efficiently target the most at-risk loans. Loan departments could also proactively reach out to holders of at-risk loans to discuss refinancing options that would improve the borrower’s ability to pay and decrease the risk of default.

  3. Improve efficiency and effectiveness of marketing campaigns:

Banks and credit unions can create data-driven marketing programs to offer personalized services and next-best products, and to improve customer onboarding by knowing which customers to reach out to at the right time. Data-driven marketing allows financial institutions to be more efficient with their marketing dollars and to better track campaign outcomes.

  4. Increase fraud detection abilities:

Unfortunately, fraud has become quite common in the financial services industry, and banks and credit unions are investing in new technologies to fight it. Artificial intelligence can be used to detect triggers that indicate fraud in transactional data. This gives institutions the ability to proactively alert customers of suspected fraudulent activities on their accounts to prevent further loss.

These applications of machine learning and AI simply scratch the surface of the outcomes that can be achieved by utilizing data, but they are not always easy to implement. Before a financial institution can embark on any advanced analytics project, it must first establish the appropriate foundational analytics infrastructure.
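As a toy illustration of the fraud-detection application above (the data and z-score threshold are hypothetical, not any product's actual model), a simple statistical trigger could flag transaction amounts that deviate sharply from an account's history:

```python
import statistics

# Toy sketch of a statistical fraud trigger. The history and threshold are
# illustrative assumptions; production fraud models are far more sophisticated.
def flag_outliers(amounts, threshold=2.0):
    """Return amounts whose z-score exceeds the threshold."""
    mean = statistics.mean(amounts)
    stdev = statistics.pstdev(amounts)
    if stdev == 0:
        return []  # no variation in history, nothing to flag
    return [a for a in amounts if abs(a - mean) / stdev > threshold]

history = [42.10, 38.75, 51.00, 45.30, 39.95, 44.20, 1250.00]
print(flag_outliers(history))  # the $1,250 charge stands out -> [1250.0]
```

In practice such a flag would feed an alerting workflow so the institution can proactively contact the customer, as described above.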

Daybreak is a Foundational Element for Analytics

There are many applications for analytics within the financial services industry, but the ability to utilize machine learning, AI, or even basic business intelligence is limited by data availability and infrastructure. One of the biggest challenges to achieving advanced analytics initiatives is collecting and aggregating data across multiple disparate sources, including core data. To make truly proactive decisions based on data, these sources need to be updated regularly, which is a challenge unto itself.

Additionally, this data needs to be aggregated on an infrastructure built for analytics. For example, a banking core system is built to record large amounts of transactions and is designed to be a system of record. But it is not the optimal type of database structure for analytics.

To solve these challenges, Aunalytics has developed Daybreak, an industry-intelligent data mart built specifically for banks and credit unions to access and take action on the right data at the right time. Daybreak includes all the infrastructure components necessary for analytics, providing financial institutions with an up-to-date, aggregated view of their data that is ready for analysis. It offers easy-to-use, intuitive analysis tools for people of all experience levels—industry-specific pre-built queries, the Data Explorer guided query tool, or the more advanced SQL Builder. Daybreak also provides easy access to up-to-date, accurate data for more advanced analytics through other modeling and data science tools.

Once this infrastructure is in place, providing the latest, analytics-ready data, the organization’s focus can shift to implementing a variety of analytics solutions, such as advanced KPIs, predictive analytics, targeted marketing, and fraud detection.

Daybreak Uses AI to Enhance Data for Analysis

In addition to providing the foundational infrastructure for analytics, Daybreak also utilizes AI to ensure the data itself is both accurate and ready for analysis. Banks and credit unions collect large amounts of data, both structured and unstructured. Unfortunately, unstructured data is difficult to integrate and analyze. Daybreak uses industry intelligence and AI to convert this unstructured data into a structured tabular format, familiar to analysts. To ensure accuracy, Daybreak utilizes AI to perform quality checks to detect anomalies as data is added or updated.

This industry intelligence also allows Daybreak to create Smart Features from existing data points. Smart Features are completely new data points that are engineered to answer actionable questions relevant to the financial services industry.

Banks and credit unions are fortunate to have a vast amount of data at their disposal, but for many institutions, that data is not always easily accessible for impactful decision-making. That is why it is necessary to build out a strong data foundation in order to take advantage of both basic business intelligence and more advanced machine learning and AI initiatives. Daybreak by Aunalytics provides the ideal, industry intelligent foundation for financial institutions to jump start their journeys toward digital transformation, with the tools they need in order to utilize data to grow their organizations.


Customer Intelligence

Aunalytics’ Client Success Team Drives Measurable Business Value

Transitioning to a more data-driven organization can be a long, complicated journey. A complete digital transformation is more than simply adopting new technologies (though that is an important component). It requires change at all levels of the business in order to pivot to a more data-enabled, customer-focused mindset. Aunalytics Client Success is committed to helping organizations digitally transform by guiding and assisting them along every step of the journey, and ultimately, allowing them to thrive.

Below, the Client Success (CS) team has answered some of the most common questions about what they do and how they help organizations achieve measurable business outcomes.

What is Client Success?

Aunalytics CS partners with clients to become their trusted advisor, building a customized CS Partnership Plan around each client’s unique business needs as its core goals. The CS Partnership Plan creates an exceptional client experience by consistently applying a combination of our team and technology to deliver measurable value and business outcomes for our clients.

What are the main goals of the Aunalytics Client Success team?

The Client Success team has four main goals:

  1. Designing targeted client experiences (by industry, product, and digital transformation stage)
  2. Recommending targeted next steps by simplifying and synthesizing complex information
  3. Delivering proactive and strategic support from onboarding to solution launch, ongoing support, and consulting
  4. Collecting and responding to client feedback on ways our service delivery can evolve

What are the various roles within the CS team?

There are two main roles within the CS team that interact with clients on a regular basis. The first is the Client Success Manager (CSM). The CSM manages day-to-day client tactical needs, providing updates and direction throughout the onboarding process. As the liaison between clients and the Aunalytics team, the CSM synthesizes complex information into clear actions, mitigates any roadblocks that may occur, and clearly communicates project milestones. The CSM works closely with clients throughout their partnership with Aunalytics, from onboarding and adoption through ongoing support and engagement.

The Client Success Advisor (CSA) works on high-level strategy with each client, translating Aunalytics’ technology solutions into measurable business outcomes. They partner with the clients’ key stakeholders to understand their strategic objectives and create a custom technology roadmap that identifies the specific steps necessary to reach their digital transformation goals. These goals are integrated into the client’s specific CS Partnership Plan to ensure we are aligned on objectives and key results, with clear owners, timelines, and expected outcomes.

How often can a client expect to hear from a CS team member throughout their engagement with Aunalytics?

The CS team is introduced to clients at the initial kickoff meeting and CSMs initiate weekly touch points to ensure onboarding milestones are being met and to communicate action items, responsible parties, and next steps. During these calls the CS team (CS Manager, CS Advisor, Data Engineer, & Business Analyst) will review the project tracker—highlighting recent accomplishments, key priorities, and next steps. Each item is documented, assigned an owner, a timeline, and clear expectations around completion criteria.

What is the Aunalytics “Side-by-Side Support” model and how does the CS team help facilitate this?

Our side-by-side service delivery model provides a dedicated account team, comprised of technology (Data Engineers (DE), Data Scientists (DS), and Business Analysts) and data experts (Product Managers, Data Ingestion Engineers, and Cloud Infrastructure Team), to help transform the way our clients work. The account team collaborates across the company, in service of the client, to ensure that everyone on the team is driving towards the client’s desired outcomes. The CSA captures this information in the CS Partnership Plan to ensure alignment, key priorities, and ownership of time-bound tasks.

The CS team partners with Aunalytics’ Product, Ingestion, and Cloud teams to share client questions, recommendations, and future enhancement ideas. The Partnership Plan is a custom document that evolves with the client’s ever-changing needs. The CSA reviews the Partnership Plan with the client every quarter to capture new goals, document accomplishments, and create feasible timelines for implementation. The goal of the CSA is to create a relationship with the client, in which they view the CSA as a key member of their internal team (e.g. the same side of the table vs. a vendor).

A successful partnership with Aunalytics’ Client Success team is one in which concrete business outcomes and value are realized by the client through the use of Aunalytics’ solutions (products + services).

What are some examples of business outcomes that CS has helped Daybreak for Financial Services clients achieve?

In addition to guidance throughout the initial implementation of Daybreak, CS has assisted banks and credit unions with the execution of a number of actionable business cases, such as:

  • Assisting financial institutions with the implementation of self-service analytics programs;
  • Improving collection and recovery rates on loans;
  • Implementing pattern recognition to ensure that risk and collection departments are efficiently targeting the most at-risk loans;
  • Creating data-driven marketing programs to offer personalized services, next-best products, and improved onboarding, allowing financial institutions to be more efficient with their marketing dollars and to better track campaign outcomes;
  • Integrating with 3rd-party software systems.

The Aunalytics Client Success team is instrumental in helping clients realize measurable business value. Together with Aunalytics’ strong technology stack, this side-by-side delivery model ensures that all clients are equipped with the resources they need to affect positive change within the organization and achieve their digital transformation goals.


Daybreak Analytic Database

Daybreak’s Built-in Industry Intelligence Leads to Faster, More Actionable Insights for Banks and Credit Unions

When choosing a piece of technology for your business, it is important to consider technical specs, features, and performance metrics. But that isn’t all that matters. Even though a product or solution may fit all of your technical requirements, it might not be a great fit for your bank or credit union. As an all-in-one data management and analytics platform, Daybreak is a uniquely strong contender on technical abilities alone, but it also offers features specifically engineered to answer the most prevalent and actionable questions that banks and credit unions currently ask, or should be asking themselves, every day. This is made possible through its built-in industry intelligence.

Industry Intelligence Increases Speed to Insights

Daybreak was developed specifically to help mid-market banks and credit unions compete by leveraging the same big data and analytics technologies and capabilities as the largest, leading institutions in the industry. We know that financial services organizations have a wealth of data—but not all of it is actionable, or even accessible, in its current state. We quickly break down data silos and integrate all the relevant data points from multiple systems, internal and external, structured and unstructured.

Daybreak is industry intelligent because we know from experience the kinds of questions an institution needs to answer—that is why over 40% of the data in our model doesn’t exist in the raw customer data. These new data points are called Smart Features.

Over 40% of the data in the Daybreak model doesn’t exist in the raw customer data. It is engineered using Smart Features.

Data Enriched with Smart Features Provides Actionable Insights

One huge advantage that Daybreak offers banks and credit unions is automated data enrichment through the use of Smart Features. Smart Features are newly calculated data points that did not exist in the source data. Daybreak utilizes AI to generate these new data points, which allow you to answer more questions about your customers than ever before. For example, Daybreak automatically converts unstructured transactional data into structured Smart Features—converting a long, confusing text string into a transaction category. A transaction that looks like this in the raw data…

CHASE CREDIT CRD CHECK PYMT SERIAL NUMBER XXXXXXXX

…is converted to this data point, which can be easily analyzed or used as a filter:

Customer Number | Account Number | Destination Category | Destination Name | Recurring Payment
123             | 456            | Credit Card          | Chase            | Yes

Another category of Smart Features comprises new values calculated from existing data. For example, Daybreak’s AI scans transactional data and determines which branch is used most often by each individual. It can look at a person’s home address and determine which branch is closest to their home. The AI also scans transactional data for anomalies and flags any unusual activity, which may indicate a fraud attempt or a life change. For example, if an account suddenly stops showing direct deposits, perhaps that customer has changed jobs, or is in the process of switching to a different bank.

Since these Smart Features are automatically created, you can start asking actionable questions right away, without waiting for complicated analysis to be performed.
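To make the idea concrete, here is a minimal rule-based sketch of this kind of enrichment. The patterns, categories, and merchant list are invented for illustration; Daybreak's actual Smart Features are generated by AI models, not a lookup table like this:

```python
import re

# Hypothetical rules mapping raw descriptor fragments to a category.
RULES = [
    (re.compile(r"CREDIT CRD|CREDIT CARD"), "Credit Card"),
    (re.compile(r"MORTG"), "Mortgage"),
    (re.compile(r"PAYROLL|DIRECT DEP"), "Direct Deposit"),
]

# Hypothetical list of known merchant names to extract.
KNOWN_MERCHANTS = ["CHASE", "DISCOVER", "CAPITAL ONE"]

def categorize(raw):
    """Turn an unstructured transaction descriptor into structured fields."""
    text = raw.upper()
    category = next((cat for pattern, cat in RULES if pattern.search(text)), "Other")
    merchant = next((m for m in KNOWN_MERCHANTS if m in text), None)
    return {"destination_category": category, "destination_name": merchant}

result = categorize("CHASE CREDIT CRD CHECK PYMT SERIAL NUMBER XXXXXXXX")
print(result)  # {'destination_category': 'Credit Card', 'destination_name': 'CHASE'}
```

A real enrichment pipeline must generalize far beyond a fixed rule table, which is why the text describes an AI-driven approach rather than pattern matching.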

Daybreak has pre-built connectors to most of the major core systems, CRMs, loan and mortgage systems, and other heavily utilized financial industry applications. This means we can get access to your data faster, including granular daily transactional data, and automatically serve it back in a format that you can use to make data-driven decisions. We’ve figured out the difficult foundational part so you don’t have to spend months of development time building a data warehouse from scratch—you will begin getting insights right away.

Daybreak aggregates data from multiple sources, and allows you to receive actionable insights right away.

Scale your Team’s Industry Experience with Daybreak

One challenge that organizations face with any new initiative is how to transfer knowledge and collaborate across departments or locations. If one team member creates a useful analysis or process, it can be difficult to share with others who may want to look at the same type of information. Daybreak’s Query Wizard makes it easy to share intelligence across your business by providing pre-built queries for common banking questions and insights. We also update the pre-built queries regularly with new business-impacting questions as they are formulated. Lastly, with a click of a button, any query is available as SQL code, giving your IT team a huge head start on any more advanced work they would like to do.

The Daybreak Advantage

Unlike other “one-size-fits-all” technology solutions, Daybreak has financial industry intelligence and AI built into the platform itself. It gives banks and credit unions quick, easy access to relevant data without excessive investments of time and money. With Daybreak, business users have access to actionable data, enriched with Smart Features, and can start answering impactful questions they’ve never been able to answer before. Your entire organization can utilize industry-specific insights and collaborate on data analysis.

Daybreak is a game-changer for banks and credit unions. Start making better business decisions by effectively leveraging your data today.


Financial Services

Solving Data Challenges for Mid-Market Community Banks

Mid-market community banks and credit unions are facing increased expectations from customers and members. As technology advances across all industries, people expect a highly-personalized experience. To provide a customized customer journey, banks and credit unions must utilize the wealth of data that is available to them—basic account information, CRM data, transactions, loan data, and other useful sources.

However, this requires an institution to overcome a number of challenges, including siloed data, poor data quality and governance, use of stale, non-actionable data, manual retrospective reporting, and overall lack of necessary analytics tools and talent. Building a strong, scalable, and automated data management foundation is necessary to realize a digital transformation and ultimately meet customer expectations through actionable analytics insights. Thankfully, these challenges are not insurmountable, and we have outlined the various challenges and put forward guiding principles to help your organization overcome them.

Asking the Right Questions, at the Right Time

Having access to data is powerful, but only if you are asking the right questions. Some organizations have hundreds of dashboards, but few of the insights are actionable. Others have specific questions they’d like to answer, but no easy way to create custom queries or visualizations to dig deeper into the data. It is important to arm all users with the tools necessary to analyze data in order to take action. Users need to be given the ability to utilize relevant pre-built queries or build their own, filter data, segment customers and members, and create meaningful lists. This is also dependent on having a strong, centralized data repository, built for analysis, that gives access to timely and accurate data.
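As a rough sketch of what such self-service segmentation looks like under the hood (the column names, values, and thresholds are hypothetical, not any specific product's schema):

```python
import pandas as pd

# Hypothetical customer extract; columns and values are illustrative only.
customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "balance": [12_000, 450, 88_000, 2_300],
    "has_mortgage": [False, False, True, False],
    "last_activity_days": [3, 210, 12, 45],
})

# Segment 1: high-balance customers without a mortgage (a cross-sell list).
cross_sell = customers[(customers["balance"] > 10_000) & ~customers["has_mortgage"]]

# Segment 2: customers with no activity in six months (a re-engagement list).
dormant = customers[customers["last_activity_days"] > 180]

print(cross_sell["customer_id"].tolist())  # [1]
print(dormant["customer_id"].tolist())     # [2]
```

Pre-built queries in a self-service tool package exactly this kind of filter-and-segment logic so that non-technical users never have to write it themselves.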

Choosing the Right Technology

While modern core systems and other business applications have their uses, they are not equipped to handle enterprise-wide analytics. These databases are systems of record, meant to handle transactions; while ideal for collecting and modifying records in real time, they are not able to meet the needs of querying and analytics. Furthermore, when data is spread across multiple systems, there must be a central repository to aggregate all of this data. A cloud-based next-gen data lake or data warehouse is the ideal option in this situation: it is easily queried and its structure lends itself to analytical applications, including machine learning, predictive analytics, and AI. By building a strong foundation for BI and analytics, community banks and credit unions make a huge leap toward digital transformation and toward competing more closely with their larger industry peers.

Breaking Down Data Silos

Financial institutions have no shortage of data. However, that data is usually siloed across many systems throughout the organization. Aggregating and integrating that data is a major challenge that, in the best-case scenario, can be difficult and time-consuming, and at worst nearly impossible (such as with transactional data). It can be especially challenging when working with vendor-hosted systems, such as your core, mortgage, loan origination, mobile/online banking, and CRMs. All of this data offers key details into customer behavior, so it is important to utilize all sources to get a complete picture of individuals, as well as the institution as a whole. This is why a singular data warehouse or data lake is essential for analysis and reporting. When all the data from various sources is ingested into this system, via internal connectors and external APIs, it is far easier to query and link the data to gain a 360-degree view of each customer or member, and discover insights you’ve never had access to before.

Ensuring Data Accuracy

Having a wealth of data and insights at your fingertips is only helpful if that data is accurate. Whenever data is entered into a system manually, it is an opportunity for mistakes to be made. If a name is misspelled in one system, it may not be obvious that it is referring to the same person in another system. Valuable insights into that individual’s behavior may be lost. On the other hand, if, in the course of ingesting data from an outside system, there is an error, would it be easy to detect the data loss? One way to mitigate these scenarios is to implement quality assurance technology and best practices. In this way, data discrepancies can more easily be detected and flagged for correction. To take it a step further, data preparation automation tools can be used to correct common mistakes as the data is ingested into the platform.
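Two of the checks described above can be sketched in a few lines. The tolerance, threshold, and function names are illustrative assumptions, not a specific product's API:

```python
import difflib

def row_count_ok(previous_count, current_count, tolerance=0.05):
    """Flag a data load whose row count dropped more than `tolerance`
    versus the prior load, a possible sign of silent loss during ingestion."""
    if previous_count == 0:
        return True  # nothing to compare against on a first load
    return current_count >= previous_count * (1 - tolerance)

def likely_same_person(name_a, name_b, threshold=0.85):
    """Crude fuzzy match for a name misspelled across systems, using the
    standard library's difflib similarity ratio."""
    ratio = difflib.SequenceMatcher(None, name_a.lower(), name_b.lower()).ratio()
    return ratio >= threshold

print(row_count_ok(100_000, 92_000))                  # False: an ~8% drop gets flagged
print(likely_same_person("Jon Smith", "John Smith"))  # True: likely the same customer
```

Checks like these run automatically as data is ingested, so discrepancies are flagged for correction rather than silently propagating into reports.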

Using Timely Data

There is nothing worse than making decisions based on stale data. Circumstances can change rapidly, and the ability to be proactive in customer and member relationships is key to providing the personalized experience they have come to expect. For example, if a customer is showing signs of potential churn, banks and credit unions need to know as soon as possible in order to intervene. Transactional databases are changing daily, so it is important to establish a system by which the foundational data repository is updated regularly. For this situation, automation is essential. Manually ingesting data on a daily basis is time-consuming and can be unrealistic for many community banks and credit unions. However, utilizing automated data ingestion and preparation ensures that the data will be updated as often as necessary, with minimal to no human intervention required.

Acquiring the Necessary Talent

Developing a foundational analytics platform is no easy task. It can take huge amounts of time and effort to build an analytics-ready data warehouse from scratch. From planning and strategizing, to actual execution, it can take many months just to get started with any BI or analytics initiative. In addition, it can be challenging, and costly, to recruit and hire data engineers, analysts, and data scientists to run and develop custom algorithms. One way that mid-market financial institutions can save time and effort is to utilize a data platform built specifically for the unique challenges of the banking industry to accelerate the development process. A product that also allows you to utilize pre-built queries, algorithms, and dashboards can also shorten the time to deployment and, ultimately, insights.

Granting Access to All Who Need It

Once the data is compiled, it can be a challenge to get it into the hands of decision-makers across the organization. Will people access data through dashboards? Do they need raw data to perform deeper analysis? Will everyone have access to all data, or will some only need a smaller subset? Using a tool that gives users the ability to interact with data in a variety of ways, be it through code or visualizations, and that gives varying levels of access, is key to managing corporate data governance. Having a centralized data repository also ensures that all users are interacting with the latest, most accurate data.

Daybreak: A Complete Solution for Banks and Credit Unions

While there are a number of challenges to overcome in becoming more customer- or member-centric, it all begins with a strong data foundation. That is why Aunalytics has developed Daybreak, an industry intelligent data model that gives banks and credit unions easy access to relevant, timely data and insights, as the sun rises. Daybreak is an all-in-one analytics solution that automates key tasks—data ingestion across multiple disparate sources, data transformation and preparation, and quality assurance practices—built on a secure, powerful cloud-based data platform, to ensure data is always up-to-date, accurate, and accessible across your organization. It also allows users to connect to any outside source, visualization, or BI tool of choice, or they can leverage Daybreak’s user-friendly, guided Query Wizard and SQL builder interfaces to get to actionable insights.

With Daybreak, anyone across your organization can gain a deeper understanding of individual customers and members, while also acquiring a high-level understanding of the business as a whole. With access to the right data, at the right time, your institution can make better business decisions, faster.


Financial Services

Bank Jumpstarts Journey to Predictive Analytics & AI


Provident Bank, a mid-sized bank with $10 billion in assets, is the oldest community bank in New Jersey, with branches across two states. Although they have successfully met their customers’ needs for more than 180 years, they knew that they needed to invest in technologies today that would carry them into the future.



Financial Services

Six Stages of Digital Transformation for Financial Institutions

Many financial institutions have been around for decades. They’ve successfully implemented the technology advances necessary to stay relevant, such as using the latest core systems and implementing digital banking services for their customers. However, the journey to a complete digital transformation involves both technical changes as well as strategic and organizational changes. To truly embrace technology and prepare for the future, each financial organization must embark on a multi-phase process to reach their full potential. We have outlined the six stages of this transformation to give banks and credit unions a high-level roadmap of what needs to occur in order to realize a complete digital transformation.

1 | Business as Usual

In the first stage of digital transformation, banks and credit unions are still maintaining the status quo rather than experimenting with new digital initiatives. Some are still hosting their business applications internally and are spending significant amounts of time performing required compliance audits. They manually compile reports using pivot tables in Excel or other spreadsheet programs. While each department has its own reports, there is little to no aggregation of data across multiple departments; only a manually created deck assembled and shared once a month. This means that they are limited to basic reporting metrics rather than deep analytics.

While they may discover some insights in their current data metrics, the insights that are gleaned from these manual reports may not be easily acted upon. Small projects may be taken on by individual departments, but there are no formal processes, and these projects are largely siloed from one another. Many times, the IT department “owns” the digital initiatives, and they tend to be tech-first rather than customer-first. Therefore, organizations are unlikely to see any significant outcomes from the small, one-off projects that are taking place during this stage, and they do not drive the overall business strategy.

2 | Present & Active

As the technology landscape evolves, it can be easy to get excited about new products and services that promise to revolutionize the field. But many banks and credit unions are not ready to go all-in until these claims are tested. Oftentimes, an experimental pilot project will get off the ground within one department. For example, they may start pulling new operational reports out of their core banking system, utilize very basic customer segmentation for a marketing campaign, or consider moving to a cloud-based system to host some of their internal applications.

However, their data continues to be siloed, insights and best practices around new technology initiatives are not shared throughout the organization, and there is little to no executive-level involvement. Still, for most banks and credit unions, dabbling in new strategies and technologies is the first step to creating a sense of excitement and building a case for digital transformation on a larger scale.

3 | Formalized

As banks and credit unions begin to see momentum build from existing pilot programs, it is increasingly easier to justify investments into new digital initiatives. In the past, marketing, customer care, and transaction core databases had been siloed; separate reporting for each was the norm. However, in the pursuit of digital transformation, moving internal applications to the cloud is an important milestone on the path to creating a single source of truth and making data accessible across the enterprise.

At this stage, a clearer path toward digital transformation emerges for the bank or credit union. More formalized experimentation begins, including greater collaboration between departments and the involvement of executive sponsors. The development of technology roadmaps, including plans to move systems to the cloud and expand internal or external managed IT and security services, ensures that the bank is strategically positioned to advance its digital initiatives.

4 | Strategic

The pace really picks up in the next stage as collaboration across business units increases and the C-suite becomes fully engaged in the digital transformation process. This allows banks and credit unions to focus on long-term strategy by putting together a multi-year roadmap for digital efforts. This is the stage where an omni-channel approach to the customer journey becomes realistic, leading to a unified customer experience across all touch points—both physical and digital. Technology is no longer implemented for the sake of an upgrade, but rather, to solve specific business challenges.

However, some challenges may present themselves at this stage. As data is more freely accessible, the quality and accuracy of the data itself may be called into question. This accentuates the need for a strategic data governance plan for the bank or credit union as a whole.

5 | Converged

Once digital champions have permeated both the executive team and the majority of business units, it becomes necessary to create a governing body or “Center of Excellence” focused specifically on digital transformation initiatives and data governance across the company. This structure eliminates repetitive tasks and roles, and allows for a unified roadmap, shared best practices, and the development of a single bank-wide digital culture and vision.

Banks and credit unions can further refine their omni-channel approach to optimizing the customer experience by creating customer journey maps for each segment. This leads to optimization of every touchpoint along the way, both digital and traditional, and informs the overall business strategy. Marketing can start to run and track highly personalized campaigns for current customers and new customers.

At this point, one-off tools are abandoned in favor of an all-encompassing cloud analytics platform to gather, house, join, and clean data in order to deliver relevant, up-to-date insights. All employees are trained on digital strategy, and new hires are screened for their ability to contribute in a digitally transformed environment. In the Converged stage, digital transformation touches every aspect of the business.

6 | Innovative & Adaptive

The final stage of the digital transformation journey can be defined by continued experimentation and innovation, which, by now, is a part of the organization’s DNA. Investment in the right people, processes, and platforms optimize both customer and employee experiences, as well as operations of the bank or credit union as a whole.

Through the Center of Excellence group, pilot projects are tested, measured, and rolled out, providing a continuous stream of innovation. The data, reporting, and analytics capabilities of the omni-channel cloud analytics platform are integrated across every department, spreading from Marketing into Sales, Service, and HR, among others. Fully personalized marketing campaigns target customers who show triggers in their checking, mortgage, or wealth management accounts, or who take actions via customer care or the mobile app. This allows the bank or credit union to make relevant recommendations on products such as loans, refinancing, and wealth management.

Training programs are set up to bring all employees up to speed on the iteration and innovation cycle, and HR is closely involved in filling the talent gaps. Financial institutions may partner with or invest in startups to further advance their technology and innovation initiatives.

Conclusion

By embracing new technologies and setting up the processes necessary for a complete digital transformation, banks and credit unions are able to personalize the customer experience, enhance and streamline operations, and stay nimble in the face of changing times. No matter where your organization currently falls on this journey, your partners at Aunalytics will help you reach your ultimate goals by providing technology, people, and processes that can take your bank or credit union to the next level.

This article was inspired by research by Altimeter, as summarized in “The Six Stages of Digital Transformation” which can be downloaded here.


Customer Intelligence

Artificial Intelligence, Machine Learning, and Deep Learning⁠

What Exactly is "Artificial Intelligence"?

If you use an automated assistant, make a simple Google search, get recommendations on Netflix or Amazon, or find a great deal in your inbox, then you have interacted with AI (Artificial Intelligence). Indeed, it seems that every company and service today is incorporating AI in some way or another. But let’s dissect what the phrase ‘Artificial Intelligence’ actually means.

Most people would agree that AI is not so advanced that these companies would have Rosie from The Jetsons analyzing customer data or Skynet making product recommendations on their store page. And on the other end, at some level it is commonly understood that AI is more complex than simple business rules and nested ‘if this, then that’ logical statements.

Things start to get murky when other phrases, often conflated with AI, are added to the mix. Amongst these terms are Machine Learning (ML) and Deep Learning (DL). One company might say they use ML in their analytics, while another might claim to use DL to help enhance creativity. Which one is better or more powerful? Are either of these actually AI? Indeed, a single company may even use these words interchangeably, or use the overlap of definitions to their marketing advantage. Still others may be considering replacing an entire analytics department with DL specialists to take advantage of this new ‘AI Revolution’.

Don’t get swept up by the hype; let’s shine a light on what these terms really mean.

Teasing out the Differences between AI, ML and DL

These three terms⁠—Artificial Intelligence, Machine Learning, and Deep Learning—are critical to understand both on their own and in relation to one another, for everyone from a sales team explaining the services they provide to the data scientists who must decide which of these model types to use. And while it is true that AI, ML, and DL each have their own definitions, data requirements, levels of complexity, transparency, and limitations, what those definitions are and how they relate depends entirely on the context in which you view them.

For example, what constitutes Machine Learning from a data acquisition perspective might look an awful lot like Deep Learning, in that both require massive amounts of labeled data; yet the two look nothing alike in the context of the types of problems each can solve, or of the skill sets required to get a specific model up and running.

For the purposes of this thought piece, the context we will use is complexity: how Artificial Intelligence, Machine Learning, and Deep Learning each simulate human intelligence, and how they incrementally build on one another. This simulation of human intelligence, called simply machine intelligence, is measured by the machine’s ability to predict, classify, learn, plan, reason, and/or perceive.

The interlink between Artificial Intelligence, Machine Learning, and Deep Learning is an important one, built on this context of increasing complexity. Due to the strong hierarchical relation between these terms, the graphic above demonstrates how we at Aunalytics have chosen to organize these ideas. Artificial Intelligence comes first of the three terms both because it historically originated first and because it is the overarching term that covers all work within the field of machine intelligence. AI, as we use it, can best be described in two ways. The most general definition of Artificial Intelligence is any technique that enables machines to mimic human intelligence.

Indeed, it may seem that any number of things computers are capable of today could be seen as AI, although the focus here is not the ability to do math or maintain an operating system⁠—these are not ‘intelligent’ enough. Rather, we are considering applications like game AI, assistive programs like Microsoft’s ‘Clippy’, and expert systems, which must predict useful material or actions, classify tasks and use cases, or perceive user and environmental behaviors to drive some action. In short, they display machine intelligence.

The key here is that all of these things perform an activity that we might attribute to human intelligence⁠—moving a bar to follow a perceived ball in the classic video game Pong, classifying that you are writing what looks to be a letter and then providing a useful template, or predicting an answer for you based on your current problem. In each scenario, the AI is provided some sort of input and must produce a dynamic response based on that input.

Glossary


Artificial Intelligence (AI): Any technique that enables machines to mimic human intelligence, or any rule-based application that simulates human intelligence.

Machine Learning (ML): A subset of AI that incorporates math and statistics in such a way that allows the application to learn from data.

Deep Learning (DL): A subset of ML that uses neural networks to learn from unstructured or unlabeled data.

Feature: A measurable attribute of data, determined to be valuable in the learning process.

Neural Network: A set of algorithms inspired by neural connections in the human brain, consisting of thousands to millions of connected processing nodes.

Classification: Identifying to which category a given data point belongs.

Graphics Processing Units (GPUs): Originally designed for graphics processing and output, GPUs are processing components that are capable of performing many operations at once, in parallel, allowing them to perform the more complicated processing tasks necessary for Deep Learning (which was not possible with traditional CPUs).

Reinforcement Learning: A form of Machine Learning where an agent learns to take actions in a well-defined environment to maximize some notion of cumulative reward.

Sampling: Within the context of AI/ML, sampling refers to the act of selecting or generating data points with the objective of improving a downstream algorithm.

Artificial Intelligence: Machines Simulating Human Intelligence

These kinds of activities are all rule-driven, a distinction that leads to our second, more application-based definition of AI: any rule-based application that simulates human intelligence. Rule-based activities possess a very limited ability to learn, opting instead to simply execute a predetermined routine given the same input. The easy Pong AI will always execute the rule provided (follow the ball), and no matter how long it plays, it will only be able to play at an easy level. Clippy will always show up on your screen when it thinks that you are writing a letter, no matter how many letters you write or how annoyed you may get. This outright inability to learn leaves much to be desired against the bar of what we would consider human-level intelligence.

Machine Learning: Learning from the Data

This brings us to Machine Learning. Machine Learning is a subset of AI that incorporates math and statistics in such a way that allows the application to learn from data. Machine Learning, then, is primarily a data-driven form of Artificial Intelligence, although rule-driven material can still be applied in concert where appropriate. Again, the key differentiator is that the algorithms used to build a Machine Learning model are not hardcoded to yield any particular output behavior. Rather, Machine Learning models are coded such that they are able to ingest data with labels⁠—e.g. this entry refers to flower A, that entry refers to flower B⁠—and then use statistical methods to find relationships within that data in dimensions higher than a human could conceptualize. These discovered relationships are key, as they represent the actual ‘learning’ in Machine Learning. Therefore, it is in the data, not the code, that the desired intelligence is encoded.
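A toy example, using invented flower measurements, makes the point that the intelligence lives in the data: the nearest-centroid classifier below contains no hardcoded decision rules, only summaries of the labeled examples it was given.

```python
# Toy illustration of "the intelligence is in the data": a nearest-centroid
# classifier whose behavior is entirely determined by labeled examples,
# not by hardcoded rules. The measurements are invented for the example.

training = [
    ((1.4, 0.2), "flower A"), ((1.3, 0.2), "flower A"), ((1.5, 0.3), "flower A"),
    ((4.7, 1.4), "flower B"), ((4.5, 1.5), "flower B"), ((4.9, 1.5), "flower B"),
]

def fit(data):
    """'Learning' here is just summarizing the labeled data per class."""
    centroids = {}
    for label in {lbl for _, lbl in data}:
        points = [x for x, lbl in data if lbl == label]
        centroids[label] = tuple(sum(dim) / len(points) for dim in zip(*points))
    return centroids

def predict(centroids, x):
    """Assign the label of the nearest class centroid."""
    dist = lambda a, b: sum((i - j) ** 2 for i, j in zip(a, b))
    return min(centroids, key=lambda lbl: dist(centroids[lbl], x))

model = fit(training)
print(predict(model, (1.6, 0.4)))  # near the flower A examples -> "flower A"
```

Retrain on different labeled data and the same code yields different behavior; that substitutability of data for code is the heart of the Machine Learning approach.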

Because of this ability to learn from a set of data, generalized models can be built that perform well on certain tasks, instead of needing to hardcode a unique AI for each use case. Common use cases for Machine Learning models include classification tasks, where a model is asked to separate different examples of data into groups based on some learned features. Examples include decision trees, which learn how best to branch on features so that you arrive at a homogeneous group (all flower A, or all churning customers). Another common case for Machine Learning is clustering, where an algorithm is not provided labeled data to train on, but rather is given a massive set of data and asked to find which entries are most alike.
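Clustering can be sketched in a few lines. The minimal k-means loop below (with illustrative data and a naive initialization) groups points purely by similarity, with no labels provided at any stage:

```python
# Minimal k-means sketch of clustering: no labels are given; the algorithm
# groups points by similarity alone. Data, k, and the initialization are
# simplified for illustration.

points = [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1), (8.0, 8.0), (8.2, 7.9), (7.8, 8.1)]

def kmeans(points, k, iters=10):
    centroids = points[:k]  # naive initialization, fine for this sketch
    for _ in range(iters):
        # Assign each point to its nearest centroid...
        clusters = [[] for _ in range(k)]
        for p in points:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[dists.index(min(dists))].append(p)
        # ...then move each centroid to the mean of its assigned points.
        centroids = [
            tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return clusters

groups = kmeans(points, k=2)
print([len(g) for g in groups])  # two groups of three similar points each
```

In production one would use a vetted implementation with smarter initialization, but the two alternating steps (assign, then re-center) are the whole idea.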

In both of these applications there is not only the opportunity for learning, but for continual learning⁠—something that hardcoded, rule-based AI simply cannot do effectively. As more data is collected, there is a growing opportunity to retrain the Machine Learning model and thus yield a more robust form of imitated intelligence. Much of modern business intelligence is built on this style of artificial intelligence, given the massive amount of data that businesses now possess.

Limitations of Machine Learning

This is not to say that machine learning is the pinnacle of AI, as there are some severe limitations. The largest limitation of this approach is that we, as humans, must painstakingly craft the datasets used to train machine learning models. While there are many generalized models to choose from, they require labeled data and handcrafted ‘features’—categories of data that are determined to be valuable in the learning process. Many datasets already contain useful features, but in some domains this is much less so. Imagine, for example, you wish to build a machine learning model that can intelligently distinguish cats from cars. Perhaps you extract the texture of fur and the sheen of car paint—but this is a very difficult thing to do, made even harder when one considers that the model should generalize to all cats and cars, in any environment or position. Sphynx cats don’t have fur, and some older cars have lost their sheen. Even in simpler, non-image cases, the trouble and time spent constructing these datasets can in some cases cost more than the good they accomplish.

Crafting these feature-rich, labeled datasets is only one of the limitations. Certain data types, like the images just described, are simply too dimensionally complex to adequately model with Machine Learning; processing images, audio, and video all suffer from this, a reminder that while these forms of AI are powerful, they are not the ultimate solution for every use case. There are other use cases, like natural language processing (NLP), where the goal is to understand unstructured text data as well as a human can, and where a Machine Learning model can be constructed⁠—although it should be acknowledged that more powerful approaches exist that can more accurately model the contextual relations within spoken language.

Deep Learning: A More Powerful Approach Utilizing Neural Networks

We call this more powerful approach ‘Deep Learning’. Deep Learning is a subset of Machine Learning in that it is data-driven modeling, although Deep Learning adds the concept of neural networks to the mix. Neural networks sound like science fiction, and indeed feature prominently in such work, but the concept has been around for quite some time. Neural networks were first imagined in the field of psychology in the 1940s around the hypothesis of neural plasticity, and migrated to the field of computer science in 1948 with Turing’s B-type machines. Research around them stagnated, however, due to conceptual gaps and a lack of powerful hardware.

Modern forms of these networks, having bridged those conceptual and hardware gaps, are able to take on the immense dimensionality that data-driven tasks demand by simulating, at a naive level, the network-like structure of neurons within a living brain. Inside these artificial networks are hundreds of small nodes, each of which takes in and processes a discrete amount of the total data provided, then passes its output on to another layer of neurons. With each successive layer, the connections of the network more accurately model the inherent variability present in the dataset, and are thus able to deliver huge improvements in areas of study previously thought to be beyond the reach of data modeling. With such amazing ability and such a long history, it is important to reiterate that neural networks, and thus Deep Learning, have only become relevant recently, due to the availability of the cheap, high-volume computational power required and the bridging of those conceptual gaps.
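A naive forward pass, with random untrained weights, illustrates the layered propagation just described; this sketch only shows how each layer of nodes transforms its input and hands the result to the next, with no training involved:

```python
import math
import random

# A naive feed-forward pass through a tiny two-layer network. Weights are
# random and untrained; the point is only to show data flowing through
# successive layers of nodes.

random.seed(0)

def layer(inputs, n_out):
    """One fully connected layer of n_out nodes with sigmoid activations."""
    outputs = []
    for _ in range(n_out):
        weights = [random.uniform(-1, 1) for _ in inputs]
        bias = random.uniform(-1, 1)
        total = sum(w * x for w, x in zip(weights, inputs)) + bias
        outputs.append(1.0 / (1.0 + math.exp(-total)))  # sigmoid squashes to (0, 1)
    return outputs

x = [0.5, -1.2, 3.0]        # a single input with 3 features
hidden = layer(x, 4)        # hidden layer: 4 nodes, each sees all 3 inputs
output = layer(hidden, 1)   # output layer: a single prediction node

print(output)  # one value in (0, 1)
```

Real deep networks stack many such layers and adjust the weights through training (backpropagation), which is where the learning actually happens; the propagation structure, however, is exactly this.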

When people talk about AI, it is Deep Learning and its derivatives that are at the heart of the most exciting and smartest products. Deep Learning takes the best from Machine Learning and builds upon it, keeping useful abilities like continual learning and data-based modeling that generalizes across hundreds of use cases, while adding support for new ones like image and video classification, or novel data generation. A huge benefit of this impressive ability to learn high-dimensional relationships is that we, as humans, do not need to spend hours painstakingly crafting unique features for a machine to digest. Instead of creating custom scripts to extract the presence of fur on a cat, or a shine on a car, we simply provide the Deep Learning model the images of each class we wish to classify. From there, the artificial neurons process the images and learn for themselves the features most important for classifying the training data. This alone frees up hundreds if not thousands of hours of development and resource time for complex tasks like image and video classification, and yields significantly more accurate results than other AI approaches.

One of the more exciting possibilities that Deep Learning brings is the capability to learn the gradients of variability in a given dataset. This provides the unique ability to sample along that newly learned function and pull out a new, never-before-seen data point that matches the context of the original dataset. NVIDIA has done some amazing work that demonstrates this using a type of Deep Learning called Generative Adversarial Networks (GANs): provided thousands of images of human faces, a GAN can sample against the learned feature distribution and thereby produce a new human face, one that does not exist in reality, with startling realism.

Deep Learning Limitations

Like its complexity predecessor Machine Learning, Deep Learning has its share of drawbacks. For one, Deep Learning yields results in an opaque way, an attribute known as ‘black box modeling’. In other words, the explainability of the model, and why it classifies data as it does, is not readily apparent. The same functionality that gives Deep Learning so much control in determining its own features is the same functionality that obscures what the model deems ‘important’. This means that we cannot say why a general Deep Learning model classifies an image as a cat instead of a car—all we can say is that there must be some statistical commonalities within the training set of cats that differ significantly enough from those of the car dataset—and while that is a lot of words, it unfortunately does not give us much actionable information. We cannot say, for example, that because an individual makes above a certain amount of money, they become more likely to repay a loan. This is where Machine Learning techniques, although more limited in scope, outshine their more conceptually and computationally complex siblings, as ML models can and typically do contain this level of information. Especially as DL models are relied upon more heavily in fields like self-driving vehicles, this ability to explain decisions will become critical to garnering trust in these Artificial Intelligences.

Another large drawback to Deep Learning is the sheer size of the computational workload it commands. Because these models simulate, even at only a basic degree, the connections present in a human brain, propagating information through the network on a feasible time scale requires special hardware. This hardware, in the form of Graphics Processing Units (GPUs), is a significant resource cost for any organization digging into Deep Learning. The power of Deep Learning to learn its own features may offset the initial capital expenditure, but even then, the technical expertise required to integrate GPUs into a technology stack is more often than not the true pain point, and can be the straw that breaks the camel’s back. Even with such large prerequisites, the power and flexibility of Deep Learning for a well-structured problem cannot be denied.

Looking Forward

As the technology continues to grow, so too will the organizing ontology we present today. One such example is the rise of reinforcement learning, a branch of AI that learns not necessarily from data alone, but from a combination of data and some well-defined environment. Such technologies take the best of data-driven and rule-driven modeling to become self-training, enabling cheaper data annotation by reducing the initial training data required. With these improvements and discoveries, it quickly becomes difficult to predict too far into the future what may be mainstream next.
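A minimal tabular Q-learning sketch, using an invented five-cell corridor as the well-defined environment, shows an agent learning from rewards rather than from labeled data:

```python
import random

# Minimal tabular Q-learning: an agent in a 5-cell corridor learns, purely
# from rewards in a well-defined environment, to walk right to the goal.
# The environment and hyperparameters are invented for this sketch.

random.seed(1)
N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]  # step left, step right
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.2

for _ in range(200):  # episodes
    s = 0
    while s != GOAL:
        # epsilon-greedy: mostly exploit the best-known action, sometimes explore
        if random.random() < epsilon:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2 = min(max(s + a, 0), N_STATES - 1)     # environment transition
        reward = 1.0 if s2 == GOAL else 0.0       # reward only at the goal
        best_next = max(Q[(s2, act)] for act in ACTIONS)
        Q[(s, a)] += alpha * (reward + gamma * best_next - Q[(s, a)])
        s = s2

policy = [max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(GOAL)]
print(policy)  # the learned policy steps right from every state
```

No one ever labels a state with its correct action; the cumulative-reward signal alone, propagated backward through the Q-table, produces the behavior.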

The outline of Artificial Intelligence, Machine Learning, and Deep Learning presented here will remain relevant for some time to come. With a larger volume of data every day, and the velocity of data creation increasing with the mass adoption of sensors and mainstream support of the Internet of Things, data-driven modeling will continue to be a requirement for businesses that wish to remain relevant, and it remains important for consumers to be aware of how all this data is actually being used. All of this serves the goal of demystifying AI and pulling back the curtain on the models that have drummed up so much public trepidation. Now that the curtain has been pulled back on the fascinating forms of AI available, we can only hope that the magic of mystery has been replaced with the magic of opportunity. Each of AI, ML, and DL has a place in any organization with the data and problem statements to chew through, and offers, in return for the effort, unparalleled opportunity to grow and better tailor itself to its customer base.

Special thanks to Tyler Danner for compiling this overview. 



Companies Merge into the New Aunalytics

FOR IMMEDIATE RELEASE

Leading analytics & IT cloud providers unify to align with direction of the future.

SOUTH BEND, IND. (October 1, 2019) – Aunalytics, a leader in data and analytics services for enterprise businesses, announced today that it has unified four entities into the new Aunalytics, adding cloud, managed, and data center services to its offerings.

The newly unified Aunalytics positions the company as a unique leader in the IT market, offering a new category of software and services. Aunalytics is creating a single ecosystem that offers a full-featured, end-to-end cloud platform, capable of everything from traditional cloud hosting, to analytics and artificial intelligence.

This move follows significant growth in Aunalytics’ Midwest footprint as the only provider of a cloud platform with both analytics and production capabilities. The expansion, driven in part by acquiring the cloud infrastructure assets of MicroIntegration in South Bend, IN and Secant in Kalamazoo, MI, has resulted in strong momentum for Aunalytics. Today, the company employs over 180 team members – representing 56 universities, six countries, four branches of the U.S. Armed Forces, and has attracted talent from 11 states specifically to work at Aunalytics.

Secant is now Aunalytics

“The Secant name served us very well for many years, but as we continue to grow across the region and around the country, it is important to evolve the brand to better reflect our people and purpose,” said Steve Burdick, VP Sales, Cloud & Managed Services. “From the most foundational IT needs, to the most challenging AI initiatives, we have an expert team that can help manage every step of a client’s digital journey.”

MicroIntegration is now Aunalytics

“The change of our name represents the merger of multiple companies with unique technology capabilities that when joined together provide a world-class cloud platform with managed services to support virtually any technology our clients need,” said Terry Gour, President & COO, Cloud & Managed Services.

Data Realty is now Aunalytics

“Data Realty’s Northern Indiana data center was one of the first buildings in South Bend’s technology park, Ignition Park, and we’re excited to once again be part of the largest company at the tech park as Aunalytics,” said Rich Carlton, President & Data Services Lead. “With almost 200 team members and multiple locations across two states, we have more top talent that can manage every step from hosting to AI.”

The Aunalytics Name

The mathematical symbol (U) means union. The letter “u” added to “analytics” symbolizes the belief that analytics is not just about data or software-as-a-service. It is about a union between technology and human intelligence, between digital partner and client, between cloud and artificial intelligence. Those that can master these unions are the companies that will truly thrive and remain competitive.

“We’re living in a data-rich world that is only getting richer. But harnessing that data is extraordinarily difficult for most companies. Businesses need a partner that can provide not just the right systems and software tools, but the people and judgement to implement them strategically,” says Carlton. “They need a catalyst to help them on their journey to digital transformation. Aunalytics is that catalyst.”

For additional information, contact:
Aunalytics
574.344.0881

About Aunalytics

Aunalytics brings together the best of human intelligence and leading digital technologies to transform business challenges into flexible, scalable solutions, measured by definable business outcomes. Through a seamless integration of tailored IT initiatives, secure cloud infrastructure, and big data, analytics and AI solutions, Aunalytics is a catalyst for human and organizational thriving.

###