Monthly Archives: May 2021

Video Highlights: Challenges of Operationalizing ML

In the panel discussion below, the focus is on the main challenges of building and deploying ML applications. The discussion includes common pitfalls, development best practices, and the latest trends in tooling to operationalize ML effectively. The presentation comes from apply(): The ML Data Engineering Conference, sponsored by Tecton.



Why 3D is the Key to Unlocking Vital Video Surveillance Data

In this contributed article, Srinath Kalluri, CEO of Oyla, suggests that by blending AI, 3D and video analytics, businesses will be able to bring better information to big data and enable smarter and safer ways of working.



APIs: The Real ML Pipeline Everyone Should Be Talking About

In this special guest feature, Rob Dickinson, CTO of Resurface Labs, suggests that to achieve greater success with AI/ML models through accurate business understanding, clear data understanding, and high data quality, today’s API-first organizations must shift toward real-time data collection.



Non-negotiables for SaaS solutions

In this contributed article, Jon Siegler, Co-Founder and Chief Product Officer at LogicGate, discusses the future of the graph database, its impact on SaaS solutions, and his experience architecting products built on the flexible data model.



You could be paying less for software licensing

High licensing and maintenance fees

Gartner, Inc. defines the TCO for enterprise software as the total cost an organization incurs to use and maintain software technology over time. To calculate TCO, companies consider direct costs, such as hardware, software and administration, and indirect costs, including human resources, project management and downtime. But there’s one key cost that is often overlooked or underestimated: ongoing licensing and maintenance fees.

Why does licensing get neglected? Perhaps it’s because some vendors have made their licensing requirements and fee structures so complex that they’re hard to understand. Maybe it’s related to license term lengths: after three or four years, fees are not as top of mind as they were at purchase time. Or, sadly, perhaps it’s because people are simply used to the pain of paying high licensing fees and have accepted them as the cost of doing business. It’s time to stop the pain.

Take a stand against high fees

Consider the case of Siteco GmbH, a lighting solutions and technology innovator that’s been illuminating the streets, cities, industries and stadiums of Germany and beyond for 150 years. Until recently, it was part of a much larger organization, a global technology company. Both organizations relied on Oracle database technology to support their SAP applications and capitalized on the size of their enterprise to negotiate licensing fees with Oracle.

But that changed in 2018 when Siteco was sold to Stern Stewart Capital. Siteco moved its IT to a data center environment. It also lost some of the benefits of being part of a large organization, such as enhanced negotiating power with Oracle to reduce its licensing and maintenance fees.

For Siteco to succeed independently, paying high Oracle licensing and maintenance fees wasn’t an option. The company sought a way out.

Seeing the light and reaping ROI

Siteco adopted IBM Db2 Database in early 2020. ROI started on day one.

Db2 contributed to a seven-figure reduction in TCO and accelerated ROI in several ways. For starters, Db2 gave Siteco some breathing room and flexibility when it came to licensing. Consider this: Oracle bases its licensing fees on the number of processor cores a customer uses. If customers want to improve performance, they have to add more cores, so performance can come at a hefty price.

By contrast, Db2 offers subcapacity licensing in a virtualized environment. For instance, if you have a 100-core server but Db2 uses just one core, you only pay for that one core, not all 100. With Oracle, unless you pin a virtual machine to a specific number of cores in a 28-core server, for example, you can be charged a license fee for all 28 cores.
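To put rough numbers on the difference, here is a minimal sketch of the two cost models; the per-core fee is a made-up placeholder, not an actual Oracle or IBM list price.

```python
# Hypothetical comparison of full-capacity vs. sub-capacity licensing.
# The per-core fee is a made-up placeholder, not a real list price.
PER_CORE_FEE = 10_000  # hypothetical annual fee per licensed core


def full_capacity_cost(server_cores: int) -> int:
    """Full-capacity model: every core in the server must be licensed."""
    return server_cores * PER_CORE_FEE


def sub_capacity_cost(cores_used: int) -> int:
    """Sub-capacity model: only the cores the database actually uses."""
    return cores_used * PER_CORE_FEE


# The 100-core example from the text, with Db2 using a single core:
print(full_capacity_cost(100))  # 1,000,000 -- pay for all 100 cores
print(sub_capacity_cost(1))     # 10,000    -- pay only for the core in use
```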

Using Db2, Siteco reduced its number of cores by 50%. How? With better compression. The company had already compressed its Oracle databases, but Db2 compressed them even more. Db2 uses deep compression technology to minimize storage space and improve database performance. Ultimately, the company reduced storage by 54%, which, in turn, reduced backup and storage costs.

And speaking of performance, Siteco runs its SAP batch jobs 27% faster and its web services jobs 30% faster than before.

With Db2, there are no hardware limitations, so you’re free to build additional systems without additional cost. You can deploy the systems wherever you want, be it on premises, in the cloud or in mixed environments. You also have the option of containerizing Db2 workloads and taking advantage of IBM Cloud Pak for Data through the Db2 extension for IBM Cloud Pak for Data. With IBM Db2 on IBM Cloud Pak for Data, you benefit from a fully integrated data and AI platform that modernizes data management.

Outstanding results are contagious

The cost savings, performance gains and ROI realized by Siteco haven’t gone unnoticed: other executives are now exploring Db2 too. Who can blame them?

Read about how Db2 Database is slashing TCO and boosting performance for other companies, such as Owens-Illinois, Inc. and Audi AG.

What can Db2 Database do for your business? To find out, book a consultation or talk to an IBM representative or IBM Business Partner. You can also read an analyst’s in-depth cost/benefit comparison between Db2 and Oracle.

The post You could be paying less for software licensing appeared first on Journey to AI Blog.



Understanding “Human Intent and Behavior” with Computer Vision

In this contributed article, editorial consultant Jelani Harper discusses how computer vision is one of the most eminent forms of statistical Artificial Intelligence in use today. Spanning object detection, facial recognition, image classification, and other techniques, it supports a range of pressing use cases from contactless shopping to video surveillance.



How can you make modernizing your data and AI architecture simpler?

IT architectures have grown increasingly dispersed and fragmented over the last decade as new data and new technologies have made their impact. As a result, many organizations are seeking to modernize and optimize their data and artificial intelligence (AI) architecture to address this lack of cohesion and the sprawl of data repositories, as well as evolving data and analytics business needs. Doing so can be difficult without the right planning and experience, which leads many to abandon these modernization efforts, or never start them at all, instead of achieving high-performance ecosystems with automation that drives better AI models and customer experiences. Fortunately, modernizing no longer needs to be that difficult. IBM has developed not only IBM Cloud Pak for Data, a data and AI platform, but also a Modernization Factory experience to deliver the planning and expertise crucial to modernization success.

Greater flexibility and integration from data management to machine learning

First, let’s take a look at the concept of a data and AI platform like IBM Cloud Pak for Data, a truly hybrid platform that can be deployed anywhere, on any cloud. Current customers of standalone IBM Db2, IBM DataStage or similar external products may question the need to move to a cloud platform if their current systems are operating well. While understandable, that view misses the potential to become more efficient and effective as industries race toward even greater use of machine learning and artificial intelligence. The benefit of a platform is that data management, data governance or DevOps, data science, machine learning, artificial intelligence and a host of other tools, even open source ones, work in concert to produce better results for decision making than any could alone.

A key example is the data fabric, which connects multiple data sources and modern data sets through data virtualization, enabling them to be accessed and governed at a single point for enhanced self-service data access by data scientists and business users. This is true whether it happens to be big data in a data lake, real-time streaming IoT data, or more traditional data in a database or data warehouse. It’s data integration without data movement but with DataOps embedded directly at the source. Moreover, the flexible licensing options and quick provisioning help future-proof the architecture for a variety of machine learning and artificial intelligence use cases, allowing new opportunities to be seized rapidly with the quick addition of new capabilities. IBM Cloud Pak for Data also recognizes the inherent complexity of the deployment environment and is built to make deployment in a hybrid, multi-cloud, multi-vendor environment as easy as possible. This is accomplished by running on Red Hat OpenShift.
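To make the single-access-point idea concrete, here is a minimal sketch using the ibm_db Python driver, assuming two sources have already been virtualized behind one SQL endpoint; the endpoint, credentials, schema and table names are hypothetical placeholders, not a prescribed setup.

```python
# Minimal sketch: one connection point for data that physically lives in
# different sources, assuming both have been virtualized into a single
# SQL endpoint. Hostname, credentials, and table names are hypothetical.
import ibm_db

conn = ibm_db.connect(
    "DATABASE=BLUDB;HOSTNAME=dv-endpoint.example.com;PORT=50001;"
    "PROTOCOL=TCPIP;UID=analyst;PWD=secret;SECURITY=SSL",
    "", ""
)

# One SQL statement joins a warehouse table with streaming IoT readings,
# even though the underlying data never moved into a single store.
sql = """
SELECT o.order_id, o.amount, s.temperature
FROM   WAREHOUSE.ORDERS    o
JOIN   IOT.SENSOR_READINGS s
       ON o.site_id = s.site_id
"""
stmt = ibm_db.exec_immediate(conn, sql)

row = ibm_db.fetch_assoc(stmt)
while row:
    print(row)
    row = ibm_db.fetch_assoc(stmt)

ibm_db.close(conn)
```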

The advice and tools you need to make modernization and optimization easy

Even when the decision to modernize with a data and AI platform has been made, the thought of actually undertaking a digital transformation can be concerning. Making the move with as little disruption as possible without losing anything in the process is a key concern. In recognition of these concerns, IBM Cloud & Cognitive Expert Labs have designed the Modernization Factory experience to help ensure things go smoothly.

As part of Modernization Factory, IBM experts will work with the customer to better understand their current landscape, use case, functionality and the business opportunity they’re trying to realize. They’ll also engage in technical discovery, which includes building an inventory, assessment and roadmap. Various mobilization and planning exercises are also used. From there, the solution is installed and provisioned, a workload test is run, and any remaining adoption and implementation activities are completed.

Whether you’re moving from an on-premises implementation to Cloud Pak for Data or to Cloud Pak for Data as a service, Db2, DataStage and IBM Cognos all benefit immensely from the Modernization Factory process, with advantages like:

Db2

  • The ability to containerize databases in minutes
  • No exposure of raw data
  • Maintaining full database integrity

DataStage

  • Automated assessment / unit-testing
  • Automated modernization of workload
  • Automated conversion of enterprise stages to CPD connectors

Cognos

  • Automated conversion of security settings to CPD native security
  • Automated modernization of workload

How to start modernizing

There has never been a better time to modernize your architecture with a data and AI platform. The era of AI demands a flexible, interconnected data architecture, and the experts at IBM are ready to help you through the process. To learn more about the importance of upgrading, read our white paper Upgrade to agility: The value of modernizing data and AI services to IBM Cloud Pak for Data.

The post How can you make modernizing your data and AI architecture simpler? appeared first on Journey to AI Blog.



Making Data Simple: What does Legacy Powers Legendary mean?

The post Making Data Simple: What does Legacy Powers Legendary mean? appeared first on Journey to AI Blog.



Quantum Machine Learning – An Introduction to QGANs

In this contributed article, data scientists from Sigmoid discuss quantum machine learning and provide an introduction to QGANs. Quantum GANs, which use a quantum generator, a quantum discriminator, or both, share the adversarial architecture of classical GANs but are designed to run on quantum systems. The quantum advantage of many algorithms is impeded by the assumption that data can be loaded into quantum states; this can be achieved for specific, but not generic, data.
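Since the blurb stays high-level, the toy sketch below illustrates the adversarial loop of a QGAN in miniature; it is a classical simulation, not the authors' code. The "quantum generator" is a single qubit rotated by RY(θ), which outputs 1 with probability sin²(θ/2); a logistic discriminator tries to separate its samples from a target Bernoulli(0.7) source, with expectations computed in closed form (on real hardware the generator gradient would instead come from the parameter-shift rule).

```python
# Toy QGAN sketch (an illustrative classical simulation, not the authors'
# implementation). The "quantum generator" is one qubit rotated by
# RY(theta), so it outputs 1 with probability sin^2(theta / 2). A logistic
# discriminator tries to separate generated bits from a target
# Bernoulli(0.7) source. Expectations over the outcomes {0, 1} are exact.
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


P_REAL = np.array([0.3, 0.7])  # target probabilities of outcomes 0 and 1
theta = 0.5                    # generator rotation angle
w, b = 0.0, 0.0                # discriminator: D(x) = sigmoid(w * x + b)
lr = 0.1

for _ in range(3000):
    p_gen = np.array([np.cos(theta / 2) ** 2, np.sin(theta / 2) ** 2])
    d = sigmoid(w * np.array([0.0, 1.0]) + b)  # D(0) and D(1)

    # Discriminator ascent: raise D on real data, lower it on generated.
    grad_logit = P_REAL * (1.0 - d) - p_gen * d
    w += lr * grad_logit[1]      # only outcome x = 1 contributes to w
    b += lr * grad_logit.sum()

    # Generator descent on the non-saturating loss -E_gen[log D(x)].
    dp_dtheta = np.array([-np.sin(theta) / 2, np.sin(theta) / 2])
    theta -= lr * -(dp_dtheta * np.log(d)).sum()

print("generator P(1):", np.sin(theta / 2) ** 2)  # drifts toward 0.7
```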



Improving Customer Experience through Interaction Analytics

In this article, we’ll explain how AI-driven “interaction analytics” represents a new and transformational technology that enables enterprise stakeholders to turn customer interactions into a competitive advantage.


