Blog Archives

Articles via RSS from IBM Big Data Hub

Weaving the data fabric through IBM Global Financing

When your IT infrastructure suffers an outage, the last thing you want is high visibility and—even worse—revenue loss. But that’s exactly what happened to IBM in 2017. An aging and disparate internal IT ecosystem that had been starved of investment created a perfect storm, and in 2018, I was brought in to lead an ambitious modernization effort that had been long overdue.

As Director of IT for IBM Global Financing, I lead a 600-person team responsible for a diverse portfolio of more than 70 applications, some with millions of lines of code accrued over decades. This complexity is compounded by the rigor and controls that IBM Global Financing must uphold. As one of the world’s largest IT captive financiers with an asset base of $25 billion, we provide loans, leases and asset recovery services to IBM customers and business partners. We’re also registered as a bank in four countries. Upgrading our infrastructure is critical for IBM’s business and for the clients we finance across 60 countries and 20 industries.

When I was planning my vision for modernization, I grappled with challenges common to CIO organizations: not only were there legacy systems, but there was a legacy culture of simply “keeping the lights on.” Additionally, many approaches to solving IT challenges were reactive rather than proactive: we made fixes after users alerted us to problems. It became abundantly clear that for IBM Global Financing to succeed, we needed to improve efficiency, respond flexibly to change, and better predict outcomes by giving all our users more access to data.

In this blog, I’ll discuss how we are achieving these goals by using IBM Cloud Pak for Data, a unified platform for data and AI, and building a “Cognitive Headquarters” with its data fabric.

The value of a data fabric

In the past, we used a traditional data warehousing approach. A myriad of transactional systems fed data via disparate interfaces into an information warehouse that various users queried. If one interface fell out of sync with the others, a damaging ripple effect could ensue.

With a data fabric, we can eliminate the need to manage and monitor different data interfaces, and we can empower our stakeholders to use the same tools. This architecture provides access to the right data at the right time, agnostic of location or deployment method. As a result, we can simplify data management and governance in complex hybrid and multicloud landscapes. Now, different groups won’t have to maintain multiple data warehouses for distinct purposes, which will reduce both risks and costs.

Once we have a data fabric in place, we can turn to another question: What do we do with all that data?

The short answer is that IBM Global Financing is building and infusing AI algorithms into business processes. On a unified platform, having built-in AI tools that connect with the advanced analytic capabilities of a data fabric allows us to create and deploy models faster. And we’re finding this seamless integration in IBM Cloud Pak for Data.

Use cases of IBM Cloud Pak for Data at IBM Global Financing

IBM Cloud Pak for Data and its services have already helped IBM Global Financing succeed in various ways:

  • Data virtualization helped modernize our sales portal, making data integration significantly faster and more efficient. In addition, predictive logic helped our sellers prioritize their workloads by surfacing which opportunities are most likely to close with financing.
  • In our accounts receivable function, data science and machine learning capabilities in IBM Watson Studio allowed us to predict whether loan payments would be on time. This yielded tangible results with collectors taking proactive actions based on prior client activities.
  • Server space can be an issue, and historical data analyzed on IBM Cloud Pak for Data helped us create a predictive support process. When a server is forecasted to reach capacity, we can start proactive remediation. As AI models get more refined, my vision is that we can eventually run automated scripts to fix problems without human intervention—achieving the holy grail of a system that heals itself.
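The accounts receivable model itself is internal, but the general idea of scoring payment risk from prior client activity can be sketched in a few lines. Everything below (feature names, weights, thresholds) is hypothetical and illustrative, not the model IBM Global Financing uses:

```python
import math

def late_payment_probability(late_ratio, avg_days_past_due, open_invoices):
    """Toy logistic model: probability that the next loan payment is late,
    based on a client's prior payment behavior. Weights are illustrative,
    not fitted to real data."""
    z = -2.0 + 3.5 * late_ratio + 0.08 * avg_days_past_due + 0.15 * open_invoices
    return 1.0 / (1.0 + math.exp(-z))

# A client with a clean payment history scores low risk...
reliable = late_payment_probability(late_ratio=0.05, avg_days_past_due=1, open_invoices=2)
# ...while a habitual late payer scores high, prompting proactive collection.
risky = late_payment_probability(late_ratio=0.60, avg_days_past_due=20, open_invoices=6)
print(f"reliable: {reliable:.2f}, risky: {risky:.2f}")
```

A real model would be trained in Watson Studio on historical payment data; the point here is only that a probability output lets collectors rank accounts and act proactively.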

With the data fabric on IBM Cloud Pak for Data, we’ll look to consolidate these use cases and many more through a cohesive and intuitive front-end experience for all our business functions.

Extending the data fabric vision: A Cognitive Headquarters

A Cognitive Headquarters is the next step in our IBM Cloud Pak for Data journey. This portal will become the single point for consuming all IBM Global Financing data simply and quickly. The data fabric will help us overcome challenges arising from outdated, fragmented and inaccurate business reporting. We could migrate data to the cloud, build an access page to all data across multicloud and/or hybrid environments, and provide user-friendly ways to retrieve and derive insights.

With the data fabric, we’ll not only modernize our legacy systems but pave the path toward greater AI-powered innovation.

We are more excited than ever that IBM Cloud Pak for Data is the future of many more data and AI use cases to come!

Next steps

Explore the benefits of a data fabric on the web and in this report.

Learn more about the services that IBM Global Financing offers.

The post Weaving the data fabric through IBM Global Financing appeared first on Journey to AI Blog.



Five best practices to improve compliance with AI regulations and standards

Over the past decade, the value of artificial intelligence (AI) has been demonstrated across many industries. This success has increased organizations’ interest in AI technologies, not just to streamline business operations but also to gain a competitive advantage.

However, if not managed according to clear principles and practices, such as those codified in authoritative rules, the deployment and use of AI to support business operations may present significant risks to individuals, groups, and even society. Such management mitigates the possibility that AI could infringe upon the fundamental rights of the individuals and groups subject to it. For example, screening resumes for prospective employment candidates using AI that is biased toward specific genders or ethnicities would clearly be unacceptable.

Authoritative rules come in different forms and have diverse application domains. They can be:

  • Government laws and regulations, such as Illinois’s new Artificial Intelligence Video Interview Act.
  • International standards, such as the ones in development by ISO AI working groups, or technology-focused standards, such as the IEEE standards for AI.
  • Internal organizational rules based on principles, policies, or procedures.

Therefore, to scale the deployment and use of AI, organizations should establish a compliance management program – one that addresses relevant requirements from applicable AI authoritative rules. Such a program constructs guardrails around the use of AI so it is consistent with the organization’s principles and values, as well as with its stakeholders’ expectations and demands.

Key practices to consider in establishing an AI compliance program

  1. Develop a global view of AI compliance

The complexity and ever-changing nature of the authoritative rules an organization must follow can be overwhelming. In addition, introducing new AI rules may negatively impact the state of compliance with some pre-existing rules. It is therefore more effective for the organization to handle AI compliance systematically, allowing for a consistent compliance approach across the organization and leveraging appropriate controls to meet applicable requirements.

  2. Get involved in AI compliance intelligence

Updates to existing AI authoritative rules and the emergence of new ones may require significant changes in the way an organization has set up its compliance program and controls. In addition, adapting to new AI requirements may introduce a level of complexity that is beyond what the organization has been prepared to take on.

To efficiently adapt to these changes, organizations should proactively monitor the development and modification of the relevant AI authoritative rules.

  3. Enable an AI compliance mapping capability

When an organization is subject to several AI authoritative rules, it is often difficult to narrow down the full set of AI requirements that should be addressed in a specific context. This is because mapping requirements from AI authoritative rules that are issued by different sources for different jurisdictions is a complex task that crosses several expertise areas (e.g., AI, privacy, security).

Using in-house or third-party outsourced resources, such as IBM Promontory Services, organizations can map out common AI requirements that need to be fulfilled consistently and supplemental ones that can be addressed as needed.

  4. Invest in AI compliance enablement

An effective AI compliance management approach includes a clear communication of the right practical steps to realize AI compliance objectives.

Organizations should develop appropriate process enablement and education activities to help employees understand their organization’s AI compliance objectives, their role in meeting those goals, as well as how to proceed in practice.

  5. Enforce AI compliance positively

Enforcing AI compliance, using relevant technical and organizational measures, is critical.

A positive compliance enforcement approach, based on promoting trust and transparency rather than overemphasizing verification, is often more effective because it allows the organization to get the full support of its employees to meet AI compliance objectives.

How technology can help

Technology plays an important role in supporting an effective AI compliance program. For example, it can help:

  • Manage the set of authoritative rules and underlying requirements, allowing efficient mapping of requirements to appropriate compliance objectives and controls.
  • Enable employees to make the right decisions and take the appropriate actions to meet compliance objectives and controls, while managing associated risks efficiently.
  • Measure and monitor compliance progress and report on that as needed in a transparent way.

With an effective AI compliance program, sponsored by leadership and endorsed by employees, companies can achieve the compliance needed to infuse trustworthy AI throughout the enterprise.


The post Five best practices to improve compliance with AI regulations and standards appeared first on Journey to AI Blog.



IBM Watson Knowledge Catalog and IBM OpenPages with Watson recognized for outstanding design

IBM always strives to provide an outstanding experience for our users, which is why it’s such an honor to be recognized for our exceptional designs. IBM Watson Knowledge Catalog and IBM OpenPages with Watson have been awarded a Red Dot 2021 Award in the Brand and Communication Design: Interface and User Experience Design category.

The Red Dot Award is one of the most competitive design awards in the industry: in 2021, over 20,000 entries from around the world competed. Being selected by the jury, a panel of 23 internationally recognized design experts, is a huge accomplishment, and IBM is delighted to have multiple teams recognized for their high-quality designs.

What is IBM Watson Knowledge Catalog?

IBM Watson Knowledge Catalog (WKC) powers intelligent, self-service discovery of business assets like data, models, and reports. Through multiple user testing sessions, the team learned that data professionals spend 80% of their time searching for high quality, reliable data. This leaves them only 20% of their time for their actual work, like data analytics or building and training AI.

Data is inherently technical and hard for non-technical people to understand. Business professionals spend time sending questions and waiting for explanations from their technical staff. Most data sits unused in department silos, forgotten or too confusing to interpret. And once data professionals do get this data, there is often no security around its use, putting the business and users at risk.

The catalog experience eliminates these barriers. WKC is designed to index data with a common set of business terminology, provide clear ownership, and create an inventory of sensitive information, giving organizations better access to their data and more efficient use of it.

WKC has designed a way to help organizations identify and keep an inventory of their sensitive data. Masking data protects the identities of thousands of users and helps organizations prepare for data privacy regulations like GDPR. WKC’s design prioritizes data protection and has increased the productivity of data professionals fourfold through better management of their projects.
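WKC’s masking is policy-driven and configurable; as a rough sketch of the underlying idea, the snippet below pseudonymizes PII columns with a salted hash so records remain joinable while identities stay hidden. The column names and salt are invented for the example, and this is not WKC’s actual mechanism:

```python
import hashlib

PII_COLUMNS = {"name", "email"}  # hypothetical policy: which columns to mask
SALT = b"rotate-me-regularly"    # in practice, a managed secret, not a literal

def mask_value(value: str) -> str:
    """Deterministic pseudonym: the same input always yields the same token,
    so masked data still joins across tables, but the original value is not
    recoverable without brute force."""
    digest = hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()
    return digest[:12]

def mask_record(record: dict) -> dict:
    # Mask only the columns the (hypothetical) policy flags as sensitive.
    return {k: mask_value(v) if k in PII_COLUMNS else v for k, v in record.items()}

row = {"name": "Ada Lovelace", "email": "ada@example.com", "balance": 1200}
masked = mask_record(row)
print(masked)  # balance survives; name and email become opaque tokens
```

Because the pseudonym is deterministic, analysts can still join and count by the masked column without ever seeing the original value.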

Information can be properly governed, protected, and easily accessed by data professionals 

We have also designed a summarized data lifecycle view, which tracks how data moves through a company. Knowing the original source of the data, how it has been transformed, and where it is ultimately used is critical information for a data consumer. The explorative graph helps them get a holistic, 360-degree picture of their data provenance in moments rather than hours. By calculating data quality and including the score in the lineage view, along with other important information such as data ownership and business terms, we help the data consumer understand their data more clearly.

A graphical, interactive view of lineage supports exploration and understanding of the data lifecycle, as well as appropriate trust calibration

WKC was designed as a direct result of real pain points encountered in our users’ day-to-day jobs. We continue to evolve the user experience and collaborate with our users to ensure companies can make protected, high-quality, trusted data available through self-service. The team created this award-winning experience by steering the process toward user goals, giving data professionals back the 80% of their time previously spent searching for data to use for more important tasks like data analytics or building AI systems.

If you would like to learn more about IBM Watson Knowledge Catalog, you can watch this two-minute video https://youtu.be/kKAfQJQnW8c or visit ibm.com/cloud/watson-knowledge-catalog.

What is IBM OpenPages with Watson?

Many organizations are struggling with the increased complexity of governance, risk, and compliance (GRC) regulations within a business environment. The risk of not complying with these regulations can result in hundreds of millions of dollars in fines, the loss of consumer trust, and damage to the brand.

Our user research has shown that traditional GRC systems are largely designed for the convenience of second-line professionals: GRC team members trained to work within these complex environments. First-line business users with less GRC compliance training find these systems unmanageable, hindering their ability to spot and respond to risks before they impact the business.

A user-focused, end-to-end GRC solution is required to foster complete participation in GRC activities across the entire organization. IBM OpenPages with Watson is an AI-driven GRC platform that breaks down risk silos and empowers first-line business users to make well-informed risk-management decisions. Through data classification, control mapping, Watson Assistant, and end-to-end data governance, OpenPages addresses these key issues.

Natural language processing (NLP) in OpenPages understands and categorizes data automatically so business users can select the correct risk categorization. This improves the speed and accuracy with which data is classified into IT incident or Basel event-type categories. NLP also helps business users detect whether an existing control can be mapped to record the organization’s compliance with a rule, reducing the time spent understanding the applicability of new rules or regulatory changes.

Control Mapping reduces the chance of duplicate internal controls in relation to specific obligations within a regulation
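OpenPages relies on Watson NLP models for this; as a deliberately simplified stand-in for how automatic categorization pre-selects a risk category for a business user to confirm, here is a keyword-overlap scorer over made-up event categories:

```python
# Hypothetical category keywords -- a toy stand-in for a trained NLP model.
CATEGORIES = {
    "External Fraud": {"phishing", "stolen", "forged", "scam"},
    "System Failure": {"outage", "crash", "downtime", "server"},
    "Process Error": {"misfiled", "duplicate", "manual", "omitted"},
}

def suggest_category(description: str) -> str:
    """Score each category by keyword overlap with the incident description
    and return the best match, mimicking how auto-classification pre-selects
    a risk category for a first-line user to review."""
    words = set(description.lower().split())
    scores = {cat: len(words & kws) for cat, kws in CATEGORIES.items()}
    return max(scores, key=scores.get)

incident = "server crash caused trading downtime overnight"
print(suggest_category(incident))  # -> System Failure
```

A production classifier would be trained on labeled incident descriptions; the sketch only shows the shape of the suggestion workflow.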

IBM Watson OpenScale helps organizations validate pre-production AI models and monitor production AI models to ensure they can be trusted to perform as intended. The built-in integration allows IBM OpenPages with Watson to automatically receive key metrics to analyze a model’s interdependence and performance, and to store documentation of model validation test results.

Under recent privacy regulations, privacy stakeholders need a holistic view of sensitive data across their organization to demonstrate compliance to regulators. Through integration with IBM Watson Knowledge Catalog, OpenPages Data Privacy Management performs privacy assessments and compliance reporting, giving data professionals the confidence that the models they build will comply with the relevant regulations.

Watson Assistant allows new users with limited knowledge around GRC to easily get help while completing tasks

If you would like to learn more about IBM OpenPages with Watson, visit https://www.ibm.com/products/openpages-with-watson

The post IBM Watson Knowledge Catalog and IBM OpenPages with Watson recognized for outstanding design appeared first on Journey to AI Blog.



Leading with trust will differentiate companies

AI is transforming how businesses operate and engage the world, delivering the power of prediction to augment human decision-making. However, for AI to realize its full potential, humans must trust its data-driven recommendations and outcomes. The recurring question: how do you build trust in data?

At IBM’s recent Chief Data and Technology Officer Summit, which drew a record number of attendees, we were joined by CDOs, CTOs, and COOs from the government, technology, healthcare, digital, and automotive industries to discuss the challenges surrounding this topic and learn how they are building trust within their organizations. In addition, we shared IBM’s approach to helping our clients achieve greater trust, transparency, and confidence in business predictions and outcomes by applying the industry’s most comprehensive data and AI governance solutions.

Trust in Data, Trust in AI

For more than a century, IBM has been committed to the responsible stewardship of data and technology. Trust is part of our DNA, and we’ve been operationalizing our values and principles for AI governance through the AI Ethics Board. Today, IBM is a leader in advancing global progress around ethical AI – from principles and policy advocacy to putting it into practice, and we continue to engage stakeholders worldwide as they explore the critical questions posed by the advancement of AI, to ensure that its full potential for positive impact can be reached. You can learn more about how to advance AI ethics beyond compliance in this recent IBV Study.

To build trust in new AI-driven technologies, we must start by ensuring the right data and AI foundations are in place. Building trust begins with governance to ensure that data and AI can be trusted. Data must be accurate, accessible, governed, secure, relevant, and handled with respect for privacy. Organizations recognize that it takes a holistic approach to manage and govern AI solutions across the entire AI lifecycle. That’s why IBM continually brings to market innovative governed data and AI technologies built on five focus areas: transparency, explainability, robustness, fairness, and privacy.

  1. Transparency

    Users must be able to see how the service works, evaluate its functionality and comprehend its strengths and limitations. Transparency reinforces trust, and the best way to promote transparency is through disclosure. Transparent AI systems share information on what data is collected, how it will be used and stored, and who has access to it.

  2. Explainability

    Any AI system on the market that is making determinations or recommendations with potentially significant implications for individuals should be able to explain and contextualize how and why it arrived at a particular conclusion.

  3. Robustness

    AI-powered systems must be actively defended from adversarial attacks, minimizing security risks and enabling confidence in system outcomes. Robust AI effectively handles exceptional conditions, such as abnormalities in input or malicious attacks, without causing unintentional harm.

  4. Fairness

    Instrumenting AI for fairness is essential. Properly calibrated, AI could assist humans in making more informed choices, process and evaluate facts faster and better, or allocate resources more fairly — allowing us to break the chain of human biases.

  5. Privacy

    AI systems must prioritize and safeguard consumers’ privacy and data rights and provide explicit assurances to users about how their personal data will be used and protected. Respect for privacy means full disclosure around what data is collected, how it will be used and stored, and who has access to it.

When people do not trust data, they fall back on intuition and experience. Building a culture of trust in data can be challenging, but it is vital to the future of the business. Another aspect we consider essential to fostering trust in data is empowerment. A data-driven culture and employee empowerment go hand in hand: companies must provide the governance and tools that enable employees to act upon data, and employees must be empowered to make informed decisions. And always start with the outcomes in mind; knowing them up front and ensuring everyone is on the same page is key.

Managing Data for AI

Data is an integral element of digital transformation for enterprises. But as organizations seek to leverage their data, they encounter challenges resulting from diverse data sources, types, structures, environments and platforms. Copying, consolidating, and moving data can affect its quality. Data silos typically complicate data integration, prevent centralized data management, and keep data from being easily accessible.

Data quality and integration can become major issues when pulling from multiple cloud environments. What is bringing companies back to a successful path for digital transformation is a new data architecture concept known as the data fabric. With its data fabric and AI capabilities, IBM is delivering what we anticipate will be a significant differentiator for customers: automating the data and AI lifecycle to free up time, money and resources, and connecting the right data to the right people at the right time. Get to know how the data fabric differs from previous architectures, what it can achieve for businesses, and IBM’s role in implementing it in this white paper.

Remember, trusting in data is fundamental to achieving higher confidence in your predictions’ quality, developing deeper insights, unlocking discoveries, and making decisions exponentially faster. To hear more from our peers and learn tangible actions that you can customize and implement into your organization, watch the replay of CDO/CTO Summit “Building and Creating Trust in Data.”

Our next event in the IBM series will be on Wednesday October 20, 2021, where we’ll explore “Balancing Innovation and Growth with Risk and Regulation.” Find out more and register for the event.

The post Leading with trust will differentiate companies appeared first on Journey to AI Blog.



Supply chain planning in an xP&A world 2.0

Supply chains continue to be tested and transformed by the increasing globalization of the world economy, along with the massive amounts of insightful but often disparate data that come with it. Demand volatility is on the rise, and given the pandemic’s ongoing uncertainty, it shows no signs of slowing any time soon. Supply chain planning seeks to achieve and maintain a lean supply equilibrium, in which organizations keep just enough inventory on hand to meet projected demand while reducing overhead and carrying costs. Supply chain leaders know one thing is clear: inventory accuracy has never been more important, and they’ll need a truly comprehensive planning tool to get it right.

Finding the balance between sufficiency and surplus can prove especially tricky; however, extended planning and analysis (xP&A) solutions are making it easier. At IBM, we take this a step further with “continuous integrated planning,” which extends planning beyond the walls of finance and fosters collaboration with other functional teams to find the right supply and demand balance.

Supply and demand: the risk of getting it wrong

Inventory accuracy often dictates a company’s success. Overshooting projections can lead to obsolescence, with excess inventory sitting in a warehouse, incurring costs and consuming valuable space that could be used for faster-moving stock. To mitigate these expenses, organizations resort to deep discounts on the slow-moving inventory. Beyond the margin erosion, there are brand and market implications as well, such as lowered consumer expectations and confidence around price and quality. That trend is extremely difficult to reverse, and it assumes you can even sell the extra inventory.

Excess inventory is only one part of the problem. Inventory shortages can also wreak havoc on a company’s bottom line. A large U.S. retail organization learned this first-hand in December 2020, when it kept in-store inventory exceptionally lean on the expectation that its route to market would shift from in-person to online shopping. Ultimately, the retailer disappointed customers who browsed empty shelves of out-of-stock items, and it lost repeat business.

Inventory forecasts are sensitive not only to internal data but also to external factors and environmental shifts. To plan with greater certainty, organizations need a planning solution that embeds predictive analytics and prescriptive analytics along with what-if scenario planning. They need a solution that integrates external factors such as weather data, market data and consumer buying patterns into the process, providing them the foresight to pivot quickly.

With prescriptive analytics, the operations team can match demand with current inventory levels at distribution locations to optimize placing the right products in the right locations at the right time. An extended planning & analysis solution provides full end-to-end visibility into both inventory and demand in real-time, thereby reducing the imbalance and providing a 360-degree view of the supply chain process.
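At its core, the prescriptive step described above is an allocation problem. Real solvers use mathematical optimization, but a greedy sketch conveys the shape of it; the locations, stock levels, and demand figures below are invented:

```python
def allocate(inventory: dict, demand: dict) -> dict:
    """Greedily ship stock from the fullest distribution center to each
    demand location, largest demand first -- a simplified stand-in for
    a prescriptive optimizer matching demand to inventory."""
    shipments = {}
    for store, needed in sorted(demand.items(), key=lambda kv: -kv[1]):
        plan = []
        while needed > 0 and any(inventory.values()):
            dc = max(inventory, key=inventory.get)  # fullest DC first
            qty = min(needed, inventory[dc])
            inventory[dc] -= qty
            needed -= qty
            plan.append((dc, qty))
        shipments[store] = plan
    return shipments

inventory = {"DC-East": 120, "DC-West": 80}
demand = {"Store-A": 100, "Store-B": 70}
print(allocate(inventory, demand))
```

A prescriptive engine would add constraints such as shipping cost, lead time, and service levels, and solve for an optimum rather than using a greedy heuristic.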

Here’s how companies have achieved inventory accuracy with an integrated planning solution.

Allen Edmonds: Finding the perfect fit between inventory levels and customer demand 

When you buy a pair of shoes from Allen Edmonds, you expect a perfect fit. To keep customers coming back for more, it’s vital to stock the right styles and sizes in the right stores at the right time.

By transforming its planning processes with IBM Planning Analytics with Watson, Allen Edmonds gained insight into sales, regional preferences and more. Smarter decisions about which items to place in which stores helped the company boost sales, customer satisfaction and loyalty—even while reducing inventory levels.

Results: a 10% lift in forecasting accuracy; results for one major event were within 3% of forecast

Pebble Beach: Creating the best shopping experience with IBM Planning Analytics with Watson 

Pebble Beach needed to satisfy shoppers with 15 unique stores and over 30,000 products.

Pebble Beach deployed IBM Planning Analytics with Watson to help its retail division analyze inventory levels, optimize purchasing, and make better use of merchandise. As a result, Pebble Beach keeps its stores fully stocked with the most desirable items, boosting sales and helping guests find the perfect memento of their visit.

In summary, we know that achieving lean supply equilibrium is difficult. Embracing extended planning using IBM Planning Analytics with Watson empowers organizations to not only overcome the supply vs. demand imbalance, but also to mitigate operating costs and ultimately enhance customer loyalty.

Learn more about IBM Planning Analytics with Watson 

The post Supply chain planning in an xP&A world 2.0 appeared first on Journey to AI Blog.



A new chapter in the IBM and Cloudera partnership

The amount of data collected by large enterprises is estimated to grow tenfold each year, and 90% of this data remains unused or underutilized. Managing these data sources across various silos is time-consuming and costly. The lack of a cohesive governance strategy can lead to challenges in visibility, governance, portability, and management that prevent enterprises from unlocking the business value of their data.

To help enterprises effectively manage their data needs, IBM entered a partnership with Cloudera almost a decade ago to expand our big data capabilities. In 2019, Cloudera merged with Hortonworks to pursue a hybrid cloud vision that further brought our companies together.

Today IBM is excited to announce a new chapter of our partnership with Cloudera that puts us in an even stronger position to help enterprises with their data and AI needs. We are strengthening our joint development and go-to-market programs to bring the advanced analytical capabilities of IBM Cloud Pak for Data, a unified platform for data and AI, to Cloudera Data Platform. This new offering will enable use cases in data science, machine learning, business intelligence, and real-time analytics directly on data within Cloudera Data Platform. The integration brings Cloudera under the IBM data fabric, a hybrid, multicloud data architecture that helps businesses access the right data just in time at the optimum cost, with end-to-end governance, regardless of where the data is stored.

Introducing Cloudera Data Platform for IBM Cloud Pak for Data

As the name suggests, this offering combines Cloudera’s best-in-class data lake with the advanced analytical capabilities of IBM Cloud Pak for Data. Cloudera Data Platform (CDP) for IBM Cloud Pak for Data provides one of the most complete multi-function platforms in the market. Now, businesses can run edge, streaming, data engineering, ETL, data warehousing, data visualization, and machine learning use cases with a single offering.

CDP for IBM Cloud Pak for Data provides a fast path to modernizing data platforms in place, without a costly architectural reimplementation and migration.

CDP for IBM Cloud Pak for Data is hybrid and secure. It can run end-to-end anywhere with a full span of security and fine-grained enterprise-level governance that many other platforms can’t match. IBM’s state-of-the-art data fabric uses AI to automate complex data management tasks and universally discover, integrate, catalog, secure, and govern data across multiple environments.

Key features

  • Separation of storage and compute — CDP for IBM Cloud Pak for Data provides a data fabric with secure access to data anywhere it resides, from ingest to governance and data engineering, serving advanced analytics and high-performance BI all on one platform.
  • SQL analytics for all your data — By leveraging Big SQL as well as Hive and Impala, CDP for IBM Cloud Pak for Data provides warehouse-grade performance that exceeds the performance of alternatives in the market.
  • Run data science at scale — Use Watson Studio and CDP to build, run, and manage AI models to a petabyte scale.
  • Automated AI lifecycle management — CDP for IBM Cloud Pak for Data leverages the automation capabilities of IBM Watson Studio to speed up the lifecycle of your critical data science projects.
  • Streamline data engineering — Take advantage of Cloudera Streaming Analytics, such as Flink, Apache Kafka, and SQL Stream Builder, and integrate them with IBM technologies like DataStage to achieve full-breadth data engineering.
  • Real-time reporting and BI — Data can be ingested in real time with Flink and then displayed in IBM Cloud Pak for Data analytics dashboards.
  • Automated governance and cataloging — Discovered data and associated metadata are automatically cataloged, and assets are generated, removing the need for manual metadata/DDL generation.
  • Open platform — Built on open systems and using non-proprietary data formats, the solution allows businesses to leverage data on any cloud.
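The automated cataloging bullet above can be made concrete with a small sketch: given raw records, infer a column type for each field and emit the CREATE TABLE DDL that would otherwise be written by hand. This is an illustrative toy, not the platform's actual discovery service; the type-inference rules and sample data are invented.

```python
# Hypothetical sketch of automated metadata discovery: infer a simple
# schema from raw string records and emit CREATE TABLE DDL, the kind of
# manual work the platform's cataloging is described as automating.

def infer_type(values):
    """Pick the narrowest SQL type that fits every sample value."""
    def fits(cast):
        try:
            for v in values:
                cast(v)
            return True
        except ValueError:
            return False
    if fits(int):
        return "INTEGER"
    if fits(float):
        return "DOUBLE"
    return "VARCHAR(255)"

def generate_ddl(table, rows):
    """rows: list of dicts with identical keys (e.g. parsed CSV)."""
    columns = rows[0].keys()
    cols = ", ".join(
        f"{name} {infer_type([r[name] for r in rows])}" for name in columns
    )
    return f"CREATE TABLE {table} ({cols})"

sample = [
    {"id": "1", "price": "9.99", "sku": "A-100"},
    {"id": "2", "price": "12.50", "sku": "B-200"},
]
print(generate_ddl("products", sample))
# CREATE TABLE products (id INTEGER, price DOUBLE, sku VARCHAR(255))
```

A production catalog would also capture lineage, ownership, and governance metadata alongside the inferred schema.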

In short, CDP for IBM Cloud Pak for Data:

  1. Enables data science at scale.
  2. Provides a seamless single view of data with complete security and governance, without the need for data movement or replication.
  3. Merges stream and batch data sets for analytics and real-time dashboards.

Together these benefits protect your existing technology investments in Hadoop while unlocking the business value of your data.

Next steps

To learn more about CDP for IBM Cloud Pak for Data, please visit our product page. You can also book a personal consultation there.

For more details, please visit IBM Cloud Pak for Data, IBM Data Fabric, and Cloudera Data Platform or join the Cloud Pak for Data Community.

The post A new chapter in the IBM and Cloudera partnership appeared first on Journey to AI Blog.



Reimagining ocean research with the world’s first autonomous ship

Humans have been exploring the ocean for thousands of years, but now the power of AI can help unlock its mysteries more than ever. To commemorate the 400th anniversary of the Mayflower’s trans-Atlantic voyage, the Mayflower Autonomous Ship (MAS) will repeat the same journey—this time without any people onboard. The world’s first full-size autonomous ship will study uncharted regions of the ocean, and an AI Captain will be at the helm.

From Plymouth, England to Plymouth, Massachusetts, the crewless vessel will use explainable AI models to make accurate navigation decisions. The ship will collect live ocean data, delivering valuable research that can inform policies for climate change and marine conservation. Through IBM technologies, MAS makes all of this possible by advancing three areas vital to a successful mission: talent, trust, and data.

The future of the ocean is at stake

More than 3.5 billion people depend on the ocean as a primary source of food, and ocean-related travel makes up 90% of global trade. Since the 1980s, however, the ocean has absorbed 90% of the excess heat from global warming, endangering life both below and above the seas.

Protecting the ocean starts with understanding more data about its ecosystem, but this undertaking requires massive investment. MAS reduces the need for enormous resources in ocean research by using data and AI to augment human work (talent), navigate safely while meeting maritime regulations (trust), and fostering collaboration to develop actionable insights (data).


Talent: Saving time and costs for scientists

A typical ocean research expedition can take six weeks with as many as 100 scientists onboard, yet often only one week is spent on actual research. The rest of the time is spent traveling to and from destinations and sometimes managing bad weather and rough seas.

“Traditional research missions can be very expensive, limited in where they can explore and take a long time to collect data,” says IBM researcher Rosie Lickorish, who spent time on the RRS James Cook as part of her Master’s in oceanography.

MAS significantly cuts down time and costs for scientists. A solar-powered vessel, it travels independently to collect data in remote and dangerous regions of the ocean. Researchers back on land can download live data and images synced to the cloud, such as whale songs or ocean chemistry detected by an “electronic tongue” called HyperTaste.

“With AI-powered sensors onboard that can analyze data as it’s collected, scientists can access more meaningful insights at greater speed,” says Lickorish. “The cost of data for our experts is low, in time as well as money.”

Trust: Navigating accurately with explainable AI

A combination of technologies helps MAS travel with precision: a vision and radar system scans the ocean and delivers data at the edge; an operational decision manager (ODM) enforces collision regulations; a decision optimization engine recommends next best actions; and a “watch dog” system detects and fixes problems.

This entire system makes the AI Captain intelligent, allowing it to make trusted navigational decisions driven by explainable AI. Rules-based decision logic in ODM validates and corrects the AI Captain’s actions. A log tracks exactly which initial conditions were fed into ODM, which path it took through the decision forest, and which outcome was reached. This makes debugging and analyzing the AI Captain’s behavior vastly easier than with the “black box” AI systems that are common today.

Safety and compliance are key. For example, decision optimization through CPLEX on IBM Cloud Pak for Data, a unified data and AI platform, helps the ship decide what to do next. CPLEX considers constraints such as obstacles; their size, speed, and direction; weather; and how much power is left in batteries. It then suggests routes to ODM, which validates them or advises another course.
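The validate-then-optimize pattern described above can be sketched in a few lines: an optimizer scores candidate routes under constraints, and a rules layer rejects any route that breaks the "rules of the road" before it reaches the helm. This is an illustrative toy, not the actual ODM rules or CPLEX model; the candidate routes, thresholds, and scoring function are all invented for the example.

```python
# Illustrative sketch of the MAS decision pattern: hard rules filter
# candidate routes (the ODM role), then an objective function picks the
# best legal option (the optimizer role). All numbers are hypothetical.

CANDIDATES = [
    {"heading": 90,  "battery_cost": 12, "obstacle_clearance_m": 800},
    {"heading": 110, "battery_cost": 9,  "obstacle_clearance_m": 150},
    {"heading": 75,  "battery_cost": 15, "obstacle_clearance_m": 1200},
]

MIN_CLEARANCE_M = 500   # hypothetical collision-regulation threshold
BATTERY_BUDGET = 14     # hypothetical remaining power budget

def rules_check(route):
    """ODM-style validation: hard constraints, pass or fail."""
    return (route["obstacle_clearance_m"] >= MIN_CLEARANCE_M
            and route["battery_cost"] <= BATTERY_BUDGET)

def score(route):
    """Optimizer-style objective: reward clearance, penalize power use."""
    return route["obstacle_clearance_m"] - 50 * route["battery_cost"]

legal = [r for r in CANDIDATES if rules_check(r)]
best = max(legal, key=score)
print(best["heading"])  # 90
```

Because every rule and score is explicit, the chosen heading can be traced back to the exact constraints that produced it, which is the explainability property the article emphasizes.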

“ODM keeps the AI Captain honest and obeying the ‘rules of the road,’” says Andy Stanford-Clark, IBM Distinguished Engineer and IBM Technical Lead for MAS.

Data: Fostering collaboration for better insights

Once the mission is complete, researchers will use IBM Cloud Pak for Data to store data, apply governance rules to enhance data quality, manage user access and analyze data for actionable insights.

Having all data managed by a unified platform can enable greater collaboration for various project teams across ten countries. In addition, organizations and universities around the world can partner with the research teams, forming a grassroots coalition to advance measures that curb pollution and climate change.

Ready to be a Mayflower?

MAS’s challenges—saving time and costs, making trustworthy predictions, and solving complex data problems—are not unique. Organizations in industries like banking, healthcare, transportation and more tackle these types of goals every day.

With the help of data and AI innovations from IBM Cloud Pak for Data, they might just become a “Mayflower” too. Learn more about the platform or schedule a consult.

The post Reimagining ocean research with the world’s first autonomous ship appeared first on Journey to AI Blog.



IBM Named a 2021 Gartner Peer Insights Customers’ Choice for Analytics and Business Intelligence Platforms

At IBM we do everything with the success of our customers in mind. With IBM Cognos Analytics, we strive to be your AI co-pilot: an integrated, easy-to-manage, and extensible data and analytics platform, powered by IBM Cloud Pak. With Cognos, we enable our customers to be faster and smarter, to look beyond what is immediately visible, and to do all of that with confidence. It is therefore with great excitement and appreciation that we announce that we have been recognized as a Customers’ Choice in the August 2021 Gartner Peer Insights ‘Voice of the Customer’: Analytics and Business Intelligence Platforms.

Read the full report  


We are thrilled that our customers share our vision and appreciate all those who have taken the time to share their successes. The Gartner Peer Insights Customers’ Choice is a recognition of vendors in this market by verified end-user professionals, taking into account both the number of reviews and the overall user ratings. To ensure fair evaluation, Gartner maintains rigorous criteria for recognizing vendors with a high customer satisfaction rate. Based on this valuable feedback, we saw three key trends emerge:

Cognos allows you to be…

Faster and Smarter

We have infused AI throughout the entire analytics lifecycle making data prep, data analysis, insight discovery and dashboard creation faster and smarter than ever before.

Here’s how one reviewer shared their appreciation:

“The enterprise approach of being able to give users what they want, when they want it, and how they want it. The full breadth of services included with Cognos Analytics can serve the whole organization without add-ins or additional buy-ins.” – Sr Financial Analyst, Energy and Utilities

Operate with Confidence

As companies work to build up their data cultures, we need to ensure that customers feel empowered and confident in using their data. We continue to provide industry leading flexible governance to ensure that data is always secure, and insights are properly accessed by the right audiences. Additionally, our enterprise-grade pixel perfect reporting allows customers to confidently distribute data insights at scale, in multiple formats, with different filters and in any language.

“We are heavy users and have been very pleased the past few years on their vision for the product along with the speed they are delivering new functionality. We heavily rely on Cognos Analytics for secure/SOX controlled reporting environment.” – Senior Director, Manufacturing

Go further

Cognos Analytics facilitates greater collaboration between various personas and functional roles in end-to-end data projects by going beyond BI. For organizations mature in their AI lifecycle, we offer Cognos as a cartridge on Cloud Pak for Data, which allows customers to reap the benefits of container management, data virtualization, AI/ML, Planning and more on a consolidated platform.

“This tool is amazing in the sense that it uses machine learning, deep learning, cognitive intelligence capabilities which make its feature shine among other tools. Definitely, this tool has filled the gaps that were created by other popular BI tools.” – Consultant, Services

At IBM Cognos Analytics, people will always be at the heart of our products. When our users take the time to submit reviews and share feedback, they are molding our products and influencing our customer journey. So, to all our customers who submitted reviews, thank you!

Together, we will continue to elevate analytics and ensure we guide users to ask questions, explore possible futures, and feel empowered to link data, models, and insights to the business problems they are meant to solve and the decisions they are meant to inform. We are the co-pilot for businesses to be agile, resilient, and competitive for whatever tomorrow may bring.

If you haven’t yet, try Cognos Analytics for free today: Start the Trial

Please check out the following IBM Customer case studies to see how some of our most innovative and engaged customers are using Cognos to drive success in their organizations.

Elkjop: https://www.ibm.com/case-studies/elkjop/

Nukissiorfiit: https://www.ibm.com/case-studies/nukissiorfiit-cognos-analytics/

Ministry of Defence: https://www.ibm.com/case-studies/ministry-of-defence-services-cognos/

 

The GARTNER PEER INSIGHTS CUSTOMERS’ CHOICE badge is a trademark and service mark of Gartner, Inc., and/or its affiliates, and is used herein with permission. All rights reserved. Gartner Peer Insights Customers’ Choice constitute the subjective opinions of individual end-user reviews, ratings, and data applied against a documented methodology; they neither represent the views of, nor constitute an endorsement by, Gartner or its affiliates.

 Gartner, Gartner Peer Insights ‘Voice of the Customer’: Analytics and Business Intelligence Platforms, Peer Contributors, Aug 9, 2021

The post IBM Named a 2021 Gartner Peer Insights Customers’ Choice for Analytics and Business Intelligence Platforms appeared first on Journey to AI Blog.



Drive exponential growth in healthcare with trustworthy AI

In the wake of the coronavirus pandemic, healthcare organizations encountered unpredictable financial, operational, and total experience challenges. According to Forrester Research[1], 69% of healthcare data and analytics decision-makers report that the adoption of AI has had a positive impact on their organization. When 2019 and 2020 survey results were compared, the percentage of healthcare organizations that are implementing or have implemented AI grew from 55 to 63 percent, while the share with no interest in or immediate plans for AI dropped from 21 to 18 percent[2]. For many in the sector, artificial intelligence is the solution for new, data-driven insights.

To learn what’s driving interest in AI, view the IBM Health Forum keynote:

Trustworthy AI: Driving exponential growth opportunities in the healthcare industry

Organizations across a wide swath of industries understand that AI is no longer just a priority; it is mandatory for success. For those who invest in trustworthy AI, the benefits are clear. AI solutions propel operational efficiency, grow revenue, drive competitive advantage, and improve patient and employee experience outcomes. Despite the industry’s intentions and expectations, many organizations still search for the best practices, technology and ecosystem to reach their AI goals. The IBM Health Forum provided new insights and guideposts for those considering or advancing trustworthy AI within their organizations.

During the Health Forum keynote, industry experts from IBM, Forrester, Highmark Health and Innocens BV shared their experiences and learnings. Their use cases and insights illustrate the potential trustworthy AI delivers to organizations across the sector.

Inderpal Bhandari, IBM Chief Data Officer, kicked off the keynote by describing the role of trust across the AI continuum, specifically:

  • Trust in data: Designing a digital business end to end with data domains, data governance, and data lineage in mind.
  • Trust in AI models: Ensuring adequate risk management of AI models and business-focused governance to better foster collaboration at the crossroads between intelligent, exponential technologies and people.
  • Trust in processes and business models: Increasing business productivity by infusing AI based on trusted platform advantages.

Next, guest speaker Dr. Kjell Carlsson, Principal Analyst at Forrester Research, shared insights on the state of data and AI in the healthcare industry then transitioned to interviews with experts in healthcare data science and research supported by AI and machine learning.

Dr. Curren Katz, Sr. Director of Data Science Portfolio Management at Janssen Research & Development, described how she and the data science team at Highmark Health used AI models that reduced hospitalization rates for high-risk individuals who could be seriously affected by sepsis. The team repurposed its work for COVID-19.

Dr. David Van Laere, Neonatal Intensive Care Specialist and Founder of Innocens BV, shared how an AI, edge computing, and Internet of Things solution contributes to life-saving research. Deployed at Antwerp University Hospital in Belgium, the Innocens Project helps identify severe sepsis in very low birth weight infants, enabling expedited treatment by healthcare teams.

View the full 20-minute video, Trustworthy AI: Driving exponential growth opportunities in the healthcare industry, and learn why trustworthy AI solutions that are robust, transparent, explainable, fair and private are driving exponential growth in the healthcare sector.

Learn more about a trusted architecture for data and AI solutions — the foundation for driving new growth opportunities.

Visit: https://www.ibm.com/products/cloud-pak-for-data

[1]: 104 Data and analytics decision-makers whose firm is implementing or expanding use of AI in Healthcare, Pharmaceuticals and medical equipment, and Insurance, Forrester Analytics Business Technographics® Business and Technology Services Survey, 2020
[2]: 177, 169 data and analytics decision-makers in Healthcare, Pharmaceuticals and medical equipment, and Insurance, Forrester Analytics Business Technographics® Data and Analytics Survey, 2019 and 2020

The post Drive exponential growth in healthcare with trustworthy AI appeared first on Journey to AI Blog.



Supply chain planning in an xP&A world

The evolution of Financial Planning and Analytics (FP&A) is well underway. Today’s best planning solutions are more accurately described by Gartner’s new terminology “Extended Planning and Analytics (xP&A),” where business planning and forecasting is streamlined and integrated across every part of the organization. At IBM, we’ve embraced this mentality for years and are taking it further with the continuous integrated planning approach to business planning, which is currently being applied across all business units, from finance to sales to supply chain and beyond.

Supply chain leaders know that true collaboration across business units is a baseline requirement for building accurate demand forecasts. To achieve this goal, they’ll need an upgrade from legacy planning tools and manual, spreadsheet-based processes. Gaining real-time visibility into disparate data is key to driving operational efficiency and actionable insights. But in today’s world, it’s not only about having internal data, or even historical data, at your disposal. To plan with greater certainty, especially during volatile times, organizations need to incorporate external data, such as weather data and employment trends, as well as scenario planning, into their planning process. Predictive analytics and scenario modeling are no longer nice-to-haves in planning but are necessary to provide the foresight organizations need to make critical business decisions. Nowhere has this requirement been more evident than in supply chain planning within the last 16 months.

With demand volatility on the rise and customer expectations continually evolving due to globalization, the risks of fragmented business processes and legacy operation management systems are too great to ignore. The correlation of external and client data allows organizations to gain deeper insights into what is driving demand, resulting in more accurate forecasts and subsequently more optimized levels of inventory.

Using one integrated planning solution also allows organizations to strategize and build plans with a synthesized view of the organization in mind. An integrated approach allows operational plans to automatically update financial plans, providing business leaders a more accurate picture of organizational performance. This enables the identification of new growth opportunities, optimized spending and the ability to achieve business goals with confident momentum.
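The idea of operational plans automatically updating financial plans can be sketched as a simple rollup: when operations revises unit volumes, the financial view recomputes from the same numbers. This is an illustrative toy, not IBM Planning Analytics itself; the products, prices, and costs are invented.

```python
# Minimal sketch of integrated planning: the operational plan (units to
# ship, per product) is the single source that the financial plan
# (revenue, margin) is derived from, so the two can never drift apart.

PRICE = {"widget": 20.0, "gadget": 35.0}      # hypothetical unit prices
UNIT_COST = {"widget": 12.0, "gadget": 21.0}  # hypothetical unit costs

def financial_plan(operational_plan):
    """Roll an operational units plan up into a financial view."""
    revenue = sum(PRICE[p] * q for p, q in operational_plan.items())
    cost = sum(UNIT_COST[p] * q for p, q in operational_plan.items())
    return {"revenue": revenue, "margin": revenue - cost}

plan = {"widget": 1000, "gadget": 400}
print(financial_plan(plan))  # {'revenue': 34000.0, 'margin': 13600.0}

# Operations revises the supply plan; finance sees the change at once.
plan["gadget"] = 600
print(financial_plan(plan))
```

In a real xP&A deployment the same principle applies across many more dimensions (region, channel, scenario), but the contract is identical: one shared plan, many derived views.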

Here’s how three companies have been able to utilize supply chain planning across their organization to do just that.

Carhartt: How AI can help you capitalize on increased demand

The retailer Carhartt has been making rugged, durable work and weekend wear for more than 130 years, selling through a traditional wholesale model. But when growth outpaced expectations a few years ago, the company experienced a replenishment problem and lost key revenue. By using AI and predictive analytics, Carhartt was able to build algorithms that take into account a huge range of factors, from economic indicators to weather, to generate demand forecasts and automate replenishment at the SKU level.
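A drastically simplified sketch of the pattern the Carhartt story describes: blend recent sales with an external signal into a per-SKU demand forecast, then trigger a replenishment order automatically when projected demand would exhaust on-hand inventory. Nothing here reflects Carhartt's actual algorithms; the sales figures, weather factor, and safety stock are invented for illustration.

```python
# Toy forecast-and-replenish loop for a single SKU: an external demand
# driver (a made-up weather index) scales the recent-sales baseline, and
# the order quantity follows mechanically from the forecast.

def forecast(recent_sales, weather_index):
    """Average recent sales, scaled by an external demand signal."""
    base = sum(recent_sales) / len(recent_sales)
    return base * weather_index

def replenishment_order(on_hand, forecast_units, safety_stock=20):
    """Order enough to cover the forecast plus a safety buffer."""
    needed = forecast_units + safety_stock - on_hand
    return max(0, round(needed))

sales_last_4_weeks = [110, 95, 120, 115]  # units for one SKU
cold_snap = 1.3                           # hypothetical weather driver
demand = forecast(sales_last_4_weeks, cold_snap)
print(replenishment_order(on_hand=60, forecast_units=demand))  # 103
```

A production system would run this per SKU across the whole catalog, with learned (rather than hand-set) weights on each external factor.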

Read more

Novolex: How to make supply chain plans better and faster during turbulent times

 Food packaging leader Novolex was struggling with lengthy annual planning cycles and the time it took to coordinate the company’s multiple locations and various systems. By switching to an integrated planning solution, they can now receive an accurate, timely view of what is happening at any given time along with clear prescriptive analytics, recommending the best course of action when facing dynamic demand.

“This year, we’ve been able to improve our inventory position by about 16%. That means the company is able to support the business with 16% less inventory than it maintained a year ago.”

Violeta Nedelcu, Supply Chain Director, Novolex

Read more

Jabil: Track costs and maximize margins with real-time data

Jabil engineers and builds a wide array of components and products for many Fortune 500 companies. With over 120 corporate clients in 10 sites, Regional Cost Accounting Manager Martin Gonzalez needed a truly integrated planning solution to gain transparency and agility. With IBM Planning Analytics with Watson, he created a margin analysis business intelligence dashboard that works at every level of the organization. Martin estimates that the dashboard lets his company create reports 25 times faster than the previous approach of building them manually in Excel.

“We’re saving each cost accountant on every site 7.5 hours a week. Now they can produce reports in two or three minutes.”

Martin Gonzalez, Regional Cost Accounting Manager, Jabil

Read more

In summary, while businesses have historically focused on financial planning and analysis, it’s clear that integrating with operational planning and extending across all functional departments allows leaders to stay better informed and drive stronger business results. Supply chain organizations can reap the rewards of xP&A and continuous integrated planning, with processes that are consistent, intuitively collaborative, and incorporate AI to provide both predictive insights and prescriptive guidance to business leaders. IBM Planning Analytics with Watson is the only truly modern, AI-powered Extended Planning and Analysis (xP&A) solution that unifies, streamlines and scales planning across every part of the organization.

Get started today

 

The post Supply chain planning in an xP&A world appeared first on Journey to AI Blog.


