Blog Archives

Articles via RSS from IBM Big Data Hub

Infuse intelligent automation at scale with IBM Cloud Pak for Data 4.0

When’s the last time you considered whether you’re operating in a truly predictive enterprise, and whether it’s easy for your data consumers, models and apps to access the right data? More often than not, the answer is a resounding “no”. Between the proliferation of data types and sources and tightening regulations, data is often held captive, sitting in silos. Traditionally, strategies for overcoming this challenge relied on consolidating the physical data into a single location, structure and vendor. While this strategy seemed great in theory, anyone who has undertaken a migration of this magnitude can tell you it’s easier said than done.

Earlier this year at THINK we unveiled our plans for the next generation of IBM Cloud Pak for Data, our alternative approach to connecting the right people to the right data at the right time. Today, I’m excited to share more details on how the latest version of the platform, version 4.0, will bring that vision to life through an intelligent data fabric.

The journey so far

Since the launch of IBM Cloud Pak for Data in 2018, our goal has always been to help customers unlock the value of their data and infuse AI throughout their business. Understanding the needs of our clients, we doubled down on delivering a first-of-its-kind containerized platform that provided flexibility to deploy the unique mix of data and AI services a client needs, in the cloud environment of their choice.

IBM Cloud Pak for Data supports a vibrant ecosystem of proprietary, third-party and open source services that we continue to expand with each release. With version 4.0 we take our efforts to the next level: new capabilities and intelligent automation help business leaders and users tackle overwhelming data complexity and more easily scale the value of their data.

Weaving the threads of an intelligent data fabric

A data fabric is an architectural pattern that dynamically orchestrates disparate data sources across a hybrid and multicloud landscape to provide business-ready data in support of analytics, AI and applications. The modular and customizable nature of IBM Cloud Pak for Data offers the ideal environment to build a data fabric from best-in-class solutions that is tailored to your unique needs. The tight integration of the microservices within the platform allows for further streamlining of the management and usage of distributed data by infusing intelligent automation. With version 4.0 we’re applying this automation in three key areas:

  1. Data access and usability – AutoSQL is a universal query engine that automates how you access, update and unify data across any source or type (clouds, warehouses, lakes, etc.) without the need for data movement or replication.
  2. Data ingestion and cataloging – AutoCatalog automates the discovery and classification of data to streamline the creation of a real-time catalog of data assets and their relationships across disparate data landscapes.
  3. Data privacy and security – AutoPrivacy uses AI to intelligently automate the identification, monitoring and enforcement of sensitive data across the organization to help minimize risk and ensure compliance.
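As a toy illustration of the kind of classification step AutoPrivacy automates, the sketch below flags columns containing common classes of sensitive data. Everything here is an invented stand-in: the real service uses trained AI models rather than hand-written rules, and these pattern names and regexes are assumptions for illustration only.

```python
import re

# Hand-written patterns for a few common sensitive-data classes.
# Purely illustrative: the real AutoPrivacy service uses trained AI
# models, not regexes. Pattern names and rules are invented here.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def classify_column(values):
    """Return the set of sensitive-data classes detected in a column."""
    found = set()
    for value in values:
        for label, pattern in PATTERNS.items():
            if pattern.search(str(value)):
                found.add(label)
    return found

print(classify_column(["alice@example.com", "bob@example.org", "n/a"]))  # → {'email'}
```

Once a column is tagged this way, downstream policy enforcement (masking, access control, audit) can key off the detected class rather than the raw values.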

Register for the webinar to learn more about our intelligent data fabric and how you can take advantage of these new technologies.

Additional enhancements woven into 4.0

Further augmenting the intelligent automation of our data fabric capabilities is another new service coming to IBM Cloud Pak for Data: IBM Match 360 with Watson. Match 360 provides a machine learning-based, easy-to-use experience for self-service entity resolution. Non-developers can now match and link data from across their organization, helping to improve overall data quality.

IBM SPSS Modeler, IBM Decision Optimization and Hadoop Execution Engine services are also included as part of IBM Cloud Pak for Data 4.0. These capabilities complement the IBM Watson Studio services already in the base offering and enable users such as business analysts and citizen data scientists to participate in building AI solutions.

AutoAI is enhanced to support relational data sources and generate exportable Python code, enabling data scientists to review and update models generated through AutoAI. This is a significant differentiator from the AutoML capabilities of competitors, where the generated model is more of a black box.

Complementary capabilities are also releasing on IBM Cloud Pak for Data as a Service, including IBM DataStage and IBM Data Virtualization. Now available fully managed, DataStage enables the building of modern data integration pipelines, and the Data Virtualization capability helps share data across the organization in near real time, connecting governed data to your AI and ML tools.

Finally, IBM Cloud Pak for Data 4.0 includes several platform enhancements, the most notable of which is the addition of Red Hat OpenShift Operators. These help automate the provisioning, scaling, patching and upgrading of IBM Cloud Pak for Data. First-time installs are significantly simplified, decreasing the cost of implementation, while seamless upgrades reduce the upgrade process from weeks to hours. Also beginning in 4.0, IBM Cloud Pak for Data is built on a common IBM Cloud Pak platform, enabling standardized Identity and Access Management and seamless navigation across all of the IBM Cloud Paks.

Data is a huge competitive advantage for companies and, when combined with AI, has the power to drive business transformation. IBM Cloud Pak for Data’s latest version enables just that, but 10x faster.

Learn more about the latest version of IBM Cloud Pak for Data by signing up for the Data Fabric Deep Dive webinar or by registering for a free trial.

The post Infuse intelligent automation at scale with IBM Cloud Pak for Data 4.0 appeared first on Journey to AI Blog.

Trustworthy AI helps Regions Bank better serve customers

Financial institutions worldwide are feeling the scrutiny from both customers and regulators alike. Perceptions of an institution’s governance practices, including its commitment to ethics, fairness, explainability and transparency of decisions, are critical to its standing. No wonder those poised to gain a competitive advantage today want to ensure their AI is fair, trustworthy, and explainable.

A member of the S&P 500 Index, Regions Financial Corporation is one of the United States’ largest full-service providers of consumer and commercial banking, wealth management and mortgage products and services. This Birmingham, Alabama-based organization has extended its culture of doing the right thing to both its customer relationships and its approach to AI.

In a recent IBM Data and AI Keynote, Trustworthy AI: Forging the future of banking, insurance and financial markets, Manav Misra, Chief Data and Analytics Officer of Regions Bank, detailed the bank’s stringent efforts to build a trustworthy AI framework – and how they’re paying off.

“Trustworthy, transparent models are critical to our success and really go back to our culture and key tenets: to serve our customers,” he said.

As banks, insurance companies and other financial institutions look to innovate with AI, the new currency is trust. Although the use of artificial intelligence continues to grow across industries including financial services, trust is at a premium, and that’s bringing greater scrutiny to AI deployments, according to IBM and Morning Consult’s Global AI Adoption Index 2021. More importantly, the index reveals that 91 percent of businesses using AI say their ability to explain how it arrived at a decision is critical.

Trustworthy AI requires data completeness, accuracy and quality, and the data underlying the models must be representative of the data used to make the decisions. Plus, the models must be “explainable,” meaning their decision-making processes are easily understood. This is especially critical in the highly regulated world of financial services.

Regions wanted to create a trustworthy framework for AI that included ModelOps capabilities and the ability to identify data and model drift. That meant it needed tools and processes to monitor data drift and ways to ensure models could be adapted if the data started to change. Misra and his team worked with IBM Data and AI Expert Labs and the IBM Data Science and AI Elite team to align data tools, methodology and personnel. Part of this effort involved understanding how IBM Cloud Pak® for Data could help them assess data drift, measure model performance, and keep their personnel informed.

Read here about the methodology they used to develop high-quality, trusted AI.

When Misra joined Regions, it was critical to demonstrate the value that data and AI could bring to the business. Rather than starting with a small project, he looked to make the biggest impact quickly.

“I had to make sure that we could show that we could move the needle and deliver large amounts of value to the business,” he said. The first data project Regions built delivered tens of millions of dollars in additional revenue to the business while reducing losses. “I used that as a way to demonstrate to other parts of the business: ‘look, we’ve done this, we can do this for you as well.’”

Soon, there was more demand than Misra’s team could meet. “It was something they signed on to and became big proponents of, so much so that innovating with digital and data is one of three strategic initiatives for the bank right now.”

Misra explained that to create trust in business decisions driven by artificial intelligence, a variety of stakeholders in the second and third lines of defense provide oversight into the quality of the company’s models. The result has been trusted data products (including those that help reduce fraud for the bank, assist commercial bankers and wealth advisors, and provide insights into consumers) so Regions can better serve customers.

For more insights from Regions Bank, State Bank of India, UBS, CIBC, ING, Rabobank, Citigroup and others, register for the recent Data and AI Virtual Forum: Banking, Insurance and Financial Markets here.

Accelerate your journey to AI by exploring IBM Cloud Pak for Data.

The post Trustworthy AI helps Regions Bank better serve customers appeared first on Journey to AI Blog.

Operationalize AI: You built an AI model, now what?

The Global AI Adoption Index 2021 reports that the top drivers of AI adoption in organizations are: 1. Advances in AI that make it more accessible (46%); 2. Business needs (46%); and 3. Changing business needs due to COVID-19 (44%). To bring AI models into production, businesses are also mitigating the following AI modeling and management issues:

  • 66% – Lack of clarity on provenance of training data
  • 64% – Lack of collaboration across roles involved in AI model development and deployment
  • 63% – Lack of AI policies
  • 63% – Monitoring AI across cloud and AI environments

Given the acceleration of AI adoption and the need to solve AI implementation challenges, AI engineering is rising to the top of the agenda for technology leaders. Software engineering and DevOps leaders can empower developers to become AI experts and play a pivotal role in ModelOps. This blog will discuss five imperatives for operationalizing AI that can help your teams boost their chances of success while addressing common challenges pre- and post-deployment.

Automate and simplify AI lifecycles

Having built DevOps practices, many software and technology leaders are adept at optimizing the Software Development Lifecycle (SDLC). More development organizations are expanding the responsibilities of deploying data and AI services as part of the development lifecycle. Advances in automated AI lifecycles can bridge the skills gap, streamline processes across teams and help synchronize cadences between DevOps and ModelOps. By uniting tools, talent and processes, you can build your DevOps practices to be AI-ready and realize returns as you move through Day-2 operations and beyond.

Implement responsible, explainable AI

The disruption caused by COVID-19 and other world events this past year may have pushed consumers past a tipping point: an organizational stance on sustainability and social responsibility is no longer just one consideration among many; it can be the deciding factor in whether consumers engage with a brand at all, let alone buy from it. Misbehaving models and concerns about AI bias and risk are part of the checklist for go or no-go decisions to implement AI. Further, the evolving nature of AI-related regulations and varying policy responses make responsible, explainable AI implementation one of the top concerns for businesses. IBM has donated Trusted AI toolkits to the Linux Foundation AI so that developers and data scientists can access toolkits for adversarial robustness, fairness and explainability, and help build the foundations of trustworthy AI.

Support model scalability, resiliency, and governance

As discussed earlier, training data is the number one concern in AI development and deployment, as it can have a substantial impact on model performance. Collecting, organizing and analyzing a sufficient volume of relevant, high quality data to train models under enterprise constraints can be challenging, especially in distributed, heterogeneous environments. Federated learning enables organizations to achieve better model accuracy by training models securely where the data resides, without transferring it to a centralized location, minimizing privacy and compliance risks. A data and AI platform with model transparency and auditability, as well as model governance with access control and security, can seamlessly integrate with DevOps toolchains and frameworks.
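To make the federated learning pattern concrete, here is a minimal, self-contained sketch of federated averaging (FedAvg). The one-weight model, the party data and the learning rate are all invented for illustration; the point is only the core idea that parties train locally and share weights, never raw records.

```python
# Minimal sketch of federated averaging (FedAvg): each party trains a
# model on its own data, and only the learned weights -- never the raw
# records -- are shared and averaged. The one-weight model y = w * x
# and all numbers below are invented for illustration.

def local_train(weight, data, lr=0.01, steps=200):
    """One party fits y = w * x to its own data by gradient descent."""
    w = weight
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_round(global_w, parties):
    """Average locally trained weights; raw data never leaves a party."""
    local_ws = [local_train(global_w, data) for data in parties]
    return sum(local_ws) / len(local_ws)

# Two parties whose private data both follow y = 3x.
parties = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0), (4.0, 12.0)]]
w = 0.0
for _ in range(3):
    w = federated_round(w, parties)
print(round(w, 2))  # → 3.0
```

Production frameworks add secure aggregation and weighting by party data size, but the privacy property is the same: the coordinator only ever sees model parameters.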

Run any AI models – language, computer vision and other custom AI models

Successful software development teams are not only integrating off-the-shelf AI services like chatbots but also building custom AI models to drive real business value. For example, a development team can combine a deep learning model for speech-to-text, a custom machine learning model predicting the next best offer and a decision optimization model for workforce scheduling, deployed with an app for a better customer experience. Beyond packaged machine learning, businesses can now more easily architect a solution that consists of a diverse set of AI models using language, computer vision and other AI techniques aided by industry accelerators.

Get more from your application, AI and cloud investments

As a development team, you are familiar with the power of innovation in an open, modern environment. By using a modern data and AI platform you can enjoy the flexibility to run your AI-powered applications across various environments—from edge to hybrid clouds—and rapidly move ideas from development to production. Watson Studio on IBM Cloud Pak for Data with Red Hat OpenShift helps you build and deploy AI-powered apps anywhere while taking advantage of one of the richest open source ecosystems with secure, enterprise-grade Kubernetes orchestration. You can start with one use case and build on your success using the same tools and processes. As you take the next steps in the journey to AI, Watson Studio can be a natural fit for building AI in your development and DevOps practices.

Next Steps

The post Operationalize AI: You built an AI model, now what? appeared first on Journey to AI Blog.

How bakery company Vaasan used AI to upgrade their planning

The Finnish baker Vaasan knows a thing or two about fast delivery. After all, the company’s roots date back to 1849, which makes Vaasan one of the oldest nationwide bakeries in Finland. Vaasan is best known as the producer of Finland’s most popular bread, Vaasan Ruispalat. The company has to be fast, because the baking industry moves quickly. The product is produced in the morning and typically sold by the afternoon. To provide high quality products and service to their customers, large bakeries must rely on very short planning cycles that are informed by various data sources from across the organization.

Today Vaasan has 1,400 employees across Finland and the Baltics, working together to put their fresh bread and pastries on store shelves daily. It takes a lot of coordination to make this happen, and as their business grows, they increasingly rely on advanced planning tools. With automated planning and analytics workflows, Vaasan is able to surface actionable insights across different business departments quickly to continuously improve their operations, 172 years into the company’s history.

The COVID-19 pandemic was, as Joonas Alasaari, Senior Business Controller at Vaasan put it, an ideal “stress test” for the company, especially in its first few weeks as consumer uncertainty peaked. Vaasan’s demand doubled overnight as panicked shoppers cleared shelves of their household staple loaves. It was an opportunity for the company to step back and reflect on how years of increasingly lean operations left them surprised by the unprecedented spike, and what they might do to better face a similar future challenge.

IBM Planning Analytics with Watson is at the root of Vaasan’s planning process, allowing the company to perform budgeting, forecasting and other financial processes in one place across the entire organization, alongside a wider range of planning functions like energy consumption management, workforce management and product planning.

Vaasan leaned on Intito, a Nordic technology consulting firm that works with the IBM Business Analytics portfolio. Intito provides clients with integrated business planning, both financial and operational, using tools like AI-infused business analytics. With the help of IBM Planning Analytics with Watson and Cognos Analytics, Intito helped Vaasan use predictive capabilities in three unique ways across their business:

  1. Predicting energy consumption and costs. Vaasan and Intito have built an algorithm that factors in external factors like expected outside temperature, along with production estimates. This gives Vaasan more accurate predictions into their future energy needs, which they can then translate into estimations of energy costs. Prior to developing this model, this data was gathered and processed manually, more slowly and with less accuracy.
  2. Long-term product planning. Vaasan developed a model that uses historic performance data to better determine where their products are situated in their life cycles. This enables planners to rely less on intuition and make more data-driven decisions. The model often gave planners much more advance notice of a product’s decline, giving them more time to course correct or develop replacement products.
  3. Analyzing cost center trends. This model is being tested today and is embedded within Vaasan’s cost centers. It is used to discover anomalies that don’t conform to typical trends. Being able to unearth anomalies so quickly saves planners hours of sifting through data manually at the end of the month.

Vaasan uses IBM Planning Analytics with Watson for monthly estimates, evaluating the current and coming fiscal years, using an integrated model with lots of inputs. With high participation across the organization, the people closest to the decisions are able to provide top-level planners with specific real-time data about current supply, demand, and external economic factors.

As a result, the company is able to operate with less excess capacity, and by extension, higher profit and customer satisfaction. This value has been lying dormant within Vaasan’s data, perhaps for over a century. They only needed the right tools to draw it out.

IBM Planning Analytics with Watson gives planners a central hub from which to gather insights and adjust plans accordingly. To learn more about how it works, watch the full fireside chat between Dave Marmer, VP, Product Management, Regtech, Cognos Analytics and Planning Analytics with Watson, IBM and Joonas Alasaari, Senior Business Controller, Vaasan, from IBM Think.

Watch now

The post How bakery company Vaasan used AI to upgrade their planning appeared first on Journey to AI Blog.

How do you drive exponential growth in the healthcare industry?

The healthcare industry is adapting to changes resulting from the coronavirus pandemic, but many complex challenges prevail. How do we anticipate and prevent hospitalization of high-risk patients? How can we reduce the length of stay without compromising quality of care? How do we improve patient experience? How do we obtain the insights needed to drive growth, reduce expenses, automate low-value repetitive tasks, and expand our strategic vision?  

Healthcare organizations are searching for an antidote—the answers to these questions. 

Is artificial intelligence the cure? The healthcare sector believes so.  

Forrester Research [1] reports that 69% of healthcare data and analytics decision-makers say the adoption of AI has had a positive impact on their organization. When 2019 and 2020 survey results were compared, the percentage of healthcare organizations that are implementing or have implemented AI grew from 55% to 63%. The share of companies with no interest or immediate plans to implement AI dropped from 21% to 18% [2].

The need for an AI antidote is not unique to the healthcare sector. Organizations across a wide swath of industries understand that AI is no longer just a priority; it is mandatory for success. For those who invest in trustworthy AI, the benefits are clear. AI solutions propel operational efficiencies, grow revenue, drive competitive advantage and improve patient experience outcomes. Despite the industry’s intentions and expectations, many organizations are still searching for the best practices, technology and ecosystem to reach their AI goals. We will cover these topics at the upcoming IBM Health Forum.

Register today for the IBM Health Forum, a virtual event premiering Thursday, June 10, 2021.

Whether you’re exploring AI capabilities or searching for greater efficiencies and possibilities in your existing AI ecosystem, the IBM Health Forum has sessions that will pique your interest and match your needs.  

The IBM Health Forum is your opportunity to learn from IBM customers and other industry experts who will candidly share their personal experiences and recommendations for business and technology leaders alike. You will hear compelling presentations from healthcare experts in varied roles across the industry: IT and ecosystem technologists, data scientists, and application users from the back office to the hospital floor, plus executives and industry analysts who tackled their business, technology and organizational challenges using AI.

During the keynote, guest speaker Dr. Kjell Carlsson, Principal Analyst at Forrester Research, shares insights on the state of data and AI in the healthcare industry. Dr. Curren Katz, Sr. Director of Data Science Portfolio Management at Janssen Research & Development, describes how she and the data science team at Highmark Health used AI models that reduced hospitalization rates for high-risk individuals who could be seriously affected by sepsis. The team was able to then repurpose its work for COVID-19. Dr. David Van Laere, Neonatal Intensive Care Specialist and Founder of Innocens BV, shares how an AI and Internet of Things solution helps save lives. The solution identified and expedited sepsis treatment in very low birth weight infants at Antwerp University Hospital in Belgium.  

And the list of expert industry speakers and sessions continues.

As you register, you’ll find 25 additional sessions categorized into four AI topics: 

  • Integrate AI capabilities into existing workflows
  • Unlock data to unify access to data across the organization 
  • Scale AI to accelerate digital transformation 
  • Modernize with cloud and technology  

Learn why trustworthy AI solutions that are robust, transparent, explainable, fair and private are essential to driving exponential growth in the healthcare sector. 

Register for the IBM Health Forum today. 


[1] Base: 104 Data and analytics decision-makers whose firm is implementing or expanding use of AI in Healthcare, Pharmaceuticals and medical equipment, and Insurance, Forrester Analytics Business Technographics® Business and Technology Services Survey, 2020 

[2] Base: 177, 169 data and analytics decision-makers in Healthcare, Pharmaceuticals and medical equipment, and Insurance, Forrester Analytics Business Technographics® Data and Analytics Survey, 2019 and 2020 

The post How do you drive exponential growth in the healthcare industry? appeared first on Journey to AI Blog.

IBM Planning Analytics delivers continuous integration with Watson

IBM’s Global C-suite study recently validated that data-driven organizations are 178% more likely to outperform their peers in terms of revenue and profitability. It’s no surprise that more and more companies are moving beyond basic Financial Planning and Analysis (FP&A) and toward adopting the mindset of Gartner’s newly dubbed “Extended Planning & Analysis” (xP&A), or what we at IBM have been calling for years “continuous integrated planning,” to cut through data silos, extract key metrics and business intelligence insights, and ensure that strategic plans and decisions are driven by a holistic approach.

But with the amount of data generated by businesses growing exponentially each year, scaling has become an especially complex issue for companies. And in times of dynamic and rapid change, relying purely on historical patterns for scenario planning is insufficient. It’s no longer just about getting finance teams the data they need or relying on the CFO to improve your bottom line. Today’s decision makers across the organization should be leveraging deep data, deep simulation, and predictive—as well as prescriptive—analytics to optimize financial and operational planning. Sales and operational (S&OP) planning turnaround time tends to be even shorter with tight decision-making windows that provide little room for error. Plans can change overnight and therefore need to be flexible, fast and adjusted in real-time across the enterprise. IBM Planning Analytics with Watson can help.

Our xP&A solution helps you streamline integrated business planning across every part of the organization, automate key processes and augment human intelligence by utilizing predictive capabilities to create more accurate, consistent, and timely forecasts. We’re the only partner that can provide a truly modern, AI-powered planning solution complete with the strongest scenario planning capabilities in the industry, empowering you to plan continuously. Here’s how.

The importance of scenario planning and what to look for in an xP&A solution

Scenario planning must evolve across your entire organization to include all relevant stakeholders and meet the dynamic nature of the market. As we continue to emerge from the volatility of COVID-19, it is increasingly clear that relying on historical data alone for forecasting, budgeting and strategic decision-making simply isn’t enough. Companies today are rethinking traditional planning frameworks to include automation, machine learning and significantly more robust predictive and prescriptive capabilities that incorporate a wider array of data, including external data such as weather and market indices, to drive more accurate predictions and decisions. To thrive during the next expected or unexpected disruption and secure true business value, companies need to embrace continuous planning as a core tenet of their risk, innovation and resilience strategy.

Simulations help to solve problems in real time by analyzing and understanding the opportunity, risk and alternative solutions. While scenario planning has traditionally focused on “what-if” capabilities to produce a range of driver values, IBM’s point of view is that the scenario planning of today requires several critical capabilities for more confident decision making and improved performance, including:

  • What-if: Driver-based planning and modeling efforts that provide users with sandboxes to test in while also incorporating AI/ML to better understand and learn how different variables might impact eventual outcomes
  • Predictive: AI-powered predictive forecasting that operates across multiple variables and can also detect patterns and outliers for more accurate, consistent, and timely forecasts— so you can best predict what’s going to happen (e.g. plan for demand) and where your business is headed
  • Prescriptive: Prescriptive analytics to prescribe the best course of action when using constraints, opportunities and weighted scores to recommend decisions—so you can not only predict what’s going to happen, but also decide how you’ll handle it (e.g. resource management and how to meet demand)
  • Scale: The ability to enable massive data combinations and high participation in scenario planning, and to ensure data synchronization across the multiple applications and models associated with the platform, in order to enhance the what-if, predictive and prescriptive capabilities
  • Advanced “Big Data” management: The integration and transformation of large amounts of financial and operational data, extending FP&A principles to realize the benefits of xP&A. With Big Data management, massive amounts of data can be ingested from multiple sources into a centralized, governed, real-time planning and analytics platform that offers a “single source of truth”
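To ground the “Predictive” capability above, here is a minimal sketch of single exponential smoothing, one of the simplest algorithmic forecasting methods. The built-in forecasting in Planning Analytics with Watson uses far richer seasonal models; this toy example, with an invented sales series, only illustrates replacing a hand-keyed forecast with an algorithmic one.

```python
# Minimal sketch of the "Predictive" capability above: single
# exponential smoothing, one of the simplest algorithmic forecasting
# methods. Planning Analytics with Watson uses far richer seasonal
# models; the series below is invented for illustration.

def exponential_smoothing(series, alpha=0.5):
    """Return one-step-ahead forecasts; the last entry is the forecast
    for the next, not-yet-observed period."""
    forecast = series[0]
    forecasts = [forecast]
    for actual in series[1:]:
        forecast = alpha * actual + (1 - alpha) * forecast
        forecasts.append(forecast)
    return forecasts

monthly_sales = [100, 110, 105, 115, 120]
print(exponential_smoothing(monthly_sales)[-1])  # → 115.0
```

The smoothing factor `alpha` controls how quickly the forecast reacts to recent actuals, which is exactly the kind of knob richer models tune automatically.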

Why IBM Planning Analytics with Watson

The predictive analytics edge

With new built-in predictive forecasting capabilities, IBM Planning Analytics with Watson puts the power of algorithmic forecasting in the hands of users — even those without data science skills — for more accurate, consistent, and timely forecasts. Our user-friendly Excel front-end makes adoption easy across line of business users, so the people who need to make micro-decisions throughout the company can do so. Our high-participation platform also allows you to scale for massive increases in volume of data or users synchronized across multiple applications and models. After all, if you can’t scale quickly and efficiently, you can’t really call your strategy “xP&A.”

But a sound strategy also revolves around trust — trust in logic, empathy and transparency, especially when it comes to AI and ML. IBM Watson recently announced new capabilities designed to help businesses build trust in their data and AI planning models. Building trust in models also means providing enhanced explainability, understanding and communication for models and their resulting predictions. That’s why we’re planning to bring a new statistical details page to IBM Planning Analytics with Watson in Q2 2021 to provide more transparent and easy-to-understand facts about how a forecasting prediction was generated. Too often, planning and analysis can feel like a black box of information privy only to the on-the-ground analysts who generated the plan. But with true extended planning and analysis, it’s critical that all areas of the business have input and insight into the process in order to better understand how a plan or forecast was created. And as IBM battles against unjust bias in AI, this kind of transparency is more important than ever to not only getting it right, but doing what’s right.

The built-in predictive capabilities in Planning Analytics with Watson put forecasting in the hands of departmental and line-of-business users; however, we understand that businesses are complex, which is why we’ve introduced enhanced integration with IBM Watson Studio for Predictive Analytics and Decision Optimization. Watson Studio’s modeler now includes more advanced predictive capabilities that allow users to customize forecasts by leveraging newer, multifaceted algorithms, methods and variables. Users can now work hand in hand with their data scientists to build more complex models that reflect their business. Additionally, this allows organizations to factor in competing goals and priorities, find the optimal answers within a given set of constraints, and explore more scenarios to arrive at the best possible outcome. In short, the combination provides more confidence in your continuous planning process.


See how some of our clients are benefiting from IBM Planning Analytics with Watson

How bakery company Vaasan used AI to upgrade their planning

To provide high-quality products and services to their customers, large bakeries like Finnish baker Vaasan rely on very short planning cycles informed by various data sources from across the organization. When the COVID-19 pandemic first hit, Vaasan’s demand doubled overnight, putting its supply chain under significant pressure. But with the help of predictive analytics, the company is able to operate with less excess capacity, as well as predict energy consumption and costs and build long-term product plans. As a result of Planning Analytics with Watson, Vaasan is seeing higher profits and customer satisfaction. Vaasan is also testing a model that analyzes cost center trends, saving planners hours of manually sifting through data at the end of each month.

See how IBM Planning Analytics with Watson helped–>

Vapo Oy integrates AI to transform Finnish energy business

February is usually Finland’s coldest month, with temperatures averaging from -22 to -3°C (-7.6 to 26.6°F). For the communities that Vapo serves, having reliable heat is critical – especially in Lapland, where winter temperatures can drop to a staggering -50°C. Because of the high-stakes environment, strategic planning and ERP best practices have long been central to Vapo’s mission. But when the Finnish government enacted stronger environmental laws to combat the climate crisis in 2019, Vapo needed to adapt.

See how IBM Planning Analytics helped Vapo shrink their carbon footprint–>

Ancestry makes planning on the IBM Cloud part of their DNA

As interest in genetic background information has grown exponentially in recent years, Ancestry needed a planning tool that could grow with them—and maintain stability throughout. With over 10 million DNA customers, identifying new methodologies to help Ancestry scale efficiently and ensure 24/7 performance management was critical to success.

See how IBM Planning Analytics helped–>

Today’s top business leaders are embracing a holistic view of planning and analytics. Business planning is no longer seen as a task relegated solely to the finance department, but as a company-wide mindset and strategy. In order to benefit from this xP&A evolution, don’t forget to take an especially close look at the depth of your offerings’ scenario planning features, as outlined above. To learn more about IBM Planning Analytics, reach out to us today.

The post IBM Planning Analytics delivers continuous integration with Watson appeared first on Journey to AI Blog.

You could be paying less for software licensing

High licensing and maintenance fees

Gartner, Inc. defines the TCO for enterprise software as the total cost an organization incurs to use and maintain software technology over time. To calculate TCO, companies consider direct costs, such as hardware, software and administration, and indirect costs, including human resources, project management and downtime. But there’s one key cost that is often overlooked or underestimated: ongoing licensing and maintenance fees. Why does licensing get neglected? Perhaps it’s because some vendors have made their licensing requirements and fee structures so complex that they’re hard to understand. Maybe it’s related to license term lengths: after three or four years, fees are not as top of mind as they were at purchase time. Or, sadly, perhaps it’s because people are simply used to the pain of paying high licensing fees and have accepted them as the cost of doing business. It’s time to stop the pain.

Take a stand against high fees

Consider the case of Siteco GmbH, a lighting solutions and technology innovator that’s been illuminating the streets, cities, industries and stadiums of Germany and beyond for 150 years. Until recently, it was part of a much larger organization, a global technology company. Both organizations relied on Oracle database technology to support their SAP applications and capitalized on the size of their enterprise to negotiate licensing fees with Oracle.

But that changed in 2018 when Siteco was sold to Stern Stewart Capital. Siteco moved its IT to a data center environment. It also lost some of the benefits of being part of a large organization, such as enhanced negotiating power with Oracle to reduce its licensing and maintenance fees.

For Siteco to succeed independently, paying high Oracle licensing and maintenance fees wasn’t an option. The company sought a way out.

Seeing the light and reaping ROI

Siteco adopted IBM Db2 Database in early 2020. ROI started on Day 1.

Db2 contributed to a 7-digit reduction in TCO and sped ROI in several ways. For starters, Db2 gave Siteco some breathing room and flexibility when it came to licensing. Consider this: Oracle bases its licensing fees on the number of processor cores a customer uses. If customers want to improve performance, they have to add more cores, so performance can come at a hefty price.

By contrast, Db2 offers subcapacity licensing in a virtualized environment. For instance, if you have a 100-core server but Db2 uses just one core, you only pay for that one core, not all 100. With Oracle, unless you pin a virtual machine to a specific number of cores in a 28-core server, for example, you can be charged a license fee for all 28 cores.
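The difference is easy to see with a toy calculation. The per-core price below is invented purely for illustration, and real licensing terms vary by contract:

```python
# Toy comparison of full-capacity vs. subcapacity per-core licensing
# in a virtualized environment. The price is illustrative only, not
# an actual IBM or Oracle rate.

def license_cost(total_cores, cores_used, price_per_core, subcapacity):
    """Return the license cost for one virtualized server."""
    billable = cores_used if subcapacity else total_cores
    return billable * price_per_core

# A 100-core server where the database actually uses just 1 core:
full = license_cost(total_cores=100, cores_used=1,
                    price_per_core=5000, subcapacity=False)
sub = license_cost(total_cores=100, cores_used=1,
                   price_per_core=5000, subcapacity=True)

print(full)  # 500000 -- billed for all 100 cores
print(sub)   # 5000   -- billed only for the core in use
```

Under subcapacity licensing, the bill tracks the cores the database actually consumes rather than the size of the box it happens to run on.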

Using Db2, Siteco reduced its number of cores by 50%. How? With better compression. The company had already compressed its Oracle databases, but Db2 compressed them even more. Db2 uses deep compression technology to minimize storage space and improve database performance. Ultimately, the company reduced storage by 54%, which, in turn, reduced backup and storage costs.

And speaking of performance, Siteco runs its SAP batch jobs 27% faster and its web services jobs 30% faster than before.

With Db2, there are no hardware limitations, so you’re free to build additional systems without additional cost. You can deploy the systems wherever you want, be it on premises, cloud or mixed environments. You also have the option of containerizing the Db2 workloads and taking advantage of IBM Cloud Pak for Data by using the Db2 extension for IBM Cloud Pak for Data. With IBM Db2 on IBM Cloud Pak for Data, you benefit from a fully integrated data and AI platform that modernizes data management.

Outstanding results are contagious

The cost savings, performance gains and ROI realized by Siteco haven’t gone unnoticed — other executives are now exploring Db2 too. Who can blame them?

Read about how Db2 Database is slashing TCO and boosting performance for other companies, such as Owens-Illinois, Inc. and Audi AG.

What can Db2 Database do for your business? To find out, book a consultation or talk to an IBM representative or IBM Business Partner. You can also read an analyst’s in-depth cost/benefit comparison between Db2 and Oracle.

The post You could be paying less for software licensing appeared first on Journey to AI Blog.

How can you make modernizing your data and AI architecture simpler?

IT architectures have grown increasingly dispersed and segmented over the last decade as new data and new technology have made their impact. Thus, many organizations are seeking to modernize and optimize their current data and artificial intelligence (AI) architecture to address the lack of cohesion, the sprawl of data repositories, and their data and analytics business needs. Of course, doing so can be difficult without the right planning and experience, leading some to abandon these modernization efforts, or never start them at all, instead of building high-performance ecosystems with automation that drives better AI models and customer experiences. Fortunately, modernizing no longer needs to be that difficult. IBM has developed not only IBM Cloud Pak for Data, a data and AI platform, but also a Modernization Factory experience to deliver the planning and expertise crucial to modernization success.

Greater flexibility and integration from data management to machine learning

First, let’s take a look at the concept of a data and AI platform like IBM Cloud Pak for Data, a truly hybrid platform that can be deployed anywhere, on any cloud. Current customers of standalone IBM Db2, IBM DataStage or external products like them may question the need to move to a cloud platform if current systems are operating well. While understandable, this view misses the potential to become more efficient and effective as industries race toward even greater use of machine learning and artificial intelligence. The benefit of a platform is that data management, data governance, DevOps, data science, machine learning, artificial intelligence and a host of other tools – even open source – work in concert to produce better results for decision making than any could alone.

A key example is the data fabric, which connects multiple data sources and modern data sets through data virtualization, enabling them to be accessed and governed at a single point for enhanced self-service data access by data scientists and business users. This is true whether it happens to be big data in a data lake, real-time streaming IoT data, or more traditional data in a database or data warehouse. It’s data integration without data movement but with DataOps embedded directly at the source. Moreover, the flexible licensing options and quick provisioning help future-proof the architecture for a variety of machine learning and artificial intelligence use cases, allowing new opportunities to be seized through the quick addition of new capabilities. IBM Cloud Pak for Data also recognizes the inherent complexity of the deployment environment and is built to make deployment in a hybrid, multi-cloud, multi-vendor environment as easy as possible. This is accomplished by running on Red Hat OpenShift.
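A toy model can make the data fabric idea concrete: one access point answering questions across several physical sources, with no data copied into a central store. The source names, records and `query` helper below are all hypothetical, not the Cloud Pak for Data API:

```python
# Toy model of data virtualization: a single query point over several
# physical sources, leaving the data where it lives. All source names
# and records are invented for illustration.

SOURCES = {
    "warehouse_db": [{"customer": "A", "orders": 12}],
    "iot_stream":   [{"customer": "A", "sensor_events": 40}],
    "data_lake":    [{"customer": "B", "orders": 3}],
}

def query(field):
    """Answer a question against every source in place, returning
    (source, record) pairs instead of moving data anywhere."""
    return [(name, rec) for name, recs in SOURCES.items()
            for rec in recs if field in rec]

# One call reaches the warehouse and the data lake alike:
print(query("orders"))
```

The same single entry point is also where governance rules would be applied, so every consumer sees consistently controlled data regardless of which source serves it.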

The advice and tools you need to make modernization and optimization easy

Even when the decision to modernize with a data and AI platform has been made, the thought of actually undertaking a digital transformation can be concerning. Making the move with as little disruption as possible without losing anything in the process is a key concern. In recognition of these concerns, IBM Cloud & Cognitive Expert Labs have designed the Modernization Factory experience to help ensure things go smoothly.

As part of Modernization Factory, IBM experts will work with the customer to better understand their current landscape, use case, functionality and the business opportunity they’re trying to realize. They’ll also engage in technical discovery, which includes building an inventory, assessment and roadmap. Various mobilization and planning exercises are also used. From there, the solution is installed and provisioned, a workload test is run, and any remaining adoption and implementation activities are completed.

Whether you’re moving from an on-premises implementation to Cloud Pak for Data or Cloud Pak for Data as a service, Db2, DataStage and IBM Cognos workloads gain immensely from the Modernization Factory process, with benefits like:


  • The ability to containerize databases in minutes
  • No exposure of raw data
  • Maintaining full database integrity


  • Automated assessment / unit-testing
  • Automated modernization of workload
  • Automated conversion of enterprise stages to CPD connectors


  • Automated conversion of security settings to CPD native security
  • Automated modernization of workload

How to start modernizing

There has never been a better time to modernize your architecture with a data and AI platform. The era of AI demands a flexible interconnected data architecture and the experts at IBM are ready to help you through the process. To learn more about the importance of upgrading, read our white paper Upgrade to agility: The value of modernizing data and AI services to IBM Cloud Pak for Data.

The post How can you make modernizing your data and AI architecture simpler? appeared first on Journey to AI Blog.

Making Data Simple: What does Legacy Powers Legendary mean?

The post Making Data Simple: What does Legacy Powers Legendary mean? appeared first on Journey to AI Blog.

Building an AI framework for fair hiring: A U.S. employer puts antibias first

Data science can quickly turn data into insights and those insights can lead to decisions. And sometimes, the results are unwittingly spoiled by bias and drift, causing mistrust. This problem undoubtedly hampers AI adoption and can negatively impact people’s lives and a company’s reputation.  

Take hiring decisions. 

Tools or recruiting systems that screen candidates have long demanded attention; as research has demonstrated, they can reflect historical discrimination embedded in the datasets they were trained on.

Sensitive features such as gender, ethnicity and age, even if not included as model inputs, can still influence the training data, its sources, and how the data reached the AI from a training dataset. In other words, even with no intent to use those features, and no direct access to them at the outset, their traces in the data can lead to incorrect decisions.

A growing concern about AI’s trustworthiness has provoked worldwide conversation among data leaders and business leaders alike about how to improve the practices of trustworthy AI and govern it across the AI lifecycle.  

How do we understand what AI models are doing?  

How do we ensure AI accuracy and fairness?  

How do we speed up production and adoption of AI models?  

Can we trust the output? 

According to IBM, a business making automated decisions driven by AI needs to be transparent. The business must know it’s making decisions that align with company policy — and the people making decisions based on AI must be able to trust it.

One major U.S. company was eager to tackle the problem on a large scale and turned to IBM for help. Within this corporation’s mandate to focus on social responsibility has been an effort to drive more workforce diversity and inclusion. When it came to its hiring practices, it was critical that this employer ensure fairness and trust were in place within its AI and ML models – especially when it came to attracting and recruiting talent.

With over 1,000 data scientists in its ranks, this industry leader has traveled far on its AI journey. Hundreds of ML models were in production, but what it lacked was an enterprise solution that assured that models could be trusted in a socially responsible manner.  

Data science leaders wanted to be able to translate the models’ decisions and results easily — in a way any hiring manager could understand. It wanted to establish fairness by accelerating the identification of any bias in hiring and “explain” decisions made by AI models. The company also knew it needed to operationalize AI governance to get more of its business users on board – so it set out to find a solution that could achieve all of these things.  
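One standard way to quantify the kind of hiring bias described above is the disparate impact ratio: the selection rate for an unprivileged group divided by the rate for a privileged group. The metric itself is well established in fairness work; the candidate counts below are made up, and this sketch is not the Watson OpenScale API:

```python
# Disparate impact ratio, a standard fairness metric.
# selection_rate = favorable outcomes / total candidates in a group.
# A ratio below roughly 0.8 (the "four-fifths rule") is a common
# flag for potential bias. All counts here are invented.

def selection_rate(hired, total):
    return hired / total

def disparate_impact(unpriv_hired, unpriv_total, priv_hired, priv_total):
    return (selection_rate(unpriv_hired, unpriv_total)
            / selection_rate(priv_hired, priv_total))

ratio = disparate_impact(unpriv_hired=30, unpriv_total=100,
                         priv_hired=60, priv_total=120)
print(round(ratio, 2))  # 0.6 -- below 0.8, so this model would be flagged
```

Monitoring tools compute metrics like this continuously against production scoring data, so a model that drifts into unfair behavior is surfaced to a hiring manager in plain terms rather than buried in model internals.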

The answer was IBM Watson® OpenScale™, an AI monitoring and management tool within IBM Cloud Pak® for Data that filled a much-needed gap. Once IBM’s Data Science and AI Elite team showed how the product could consistently manage AI models for accuracy and fairness, IBM’s Expert Lab services came in to drive the ongoing teamwork needed to reach the corporation’s goals.

One team solving the problem together 

Since partnering closely with IBM, the company has been tapping IBM’s Expert Lab services to implement IBM Watson OpenScale on Cloud Pak for Data in several use cases, relying on IBM’s expertise for this area of the AI lifecycle. The partnership has resulted in the creation of an enterprise framework that can operate at the scale of the enormous organization. Today the customer has all the capabilities it needs to manage bias, fairness, accuracy, drift, explainability and transparency in its use of AI and machine learning.

Now, the company is proactively monitoring for and mitigating bias in its hiring processes. Because automation has reduced the workload within DevOps, the company’s data scientists can focus more on new model development and refinement.

Today, companies across all industries have a clear opportunity to harness data and AI to build effective and scalable solutions while eradicating systemic racism and structural inequality. And there’s no denying the fact that there’s a relationship between higher growth and the ability to scale AI with repeatable, trustworthy processes. According to a January 2020 Forrester Consulting study commissioned by IBM, Overcome Obstacles to get to AI at scale, the companies that are the fastest growing in their industries are over six times more likely to have scaled AI.

There’s no better time to address the societal relevance of AI and the need for a trustworthy AI framework that is based on ethics, governs data and AI technology, and is rooted in a diverse and open ecosystem.

Find out how by visiting Accelerate your AI journey with a prescriptive approach.

The post Building an AI framework for fair hiring: A U.S. employer puts antibias first appeared first on Journey to AI Blog.