Heard on the Street – 1/10/2024

Welcome to insideBIGDATA’s “Heard on the Street” round-up column! In this regular feature, we highlight thought-leadership commentaries from members of the big data ecosystem. Each edition covers the trends of the day with compelling perspectives that can provide important insights to give you a competitive advantage in the marketplace. We invite submissions with a focus on our favored technology topic areas: big data, data science, machine learning, AI and deep learning. Click HERE to check out previous “Heard on the Street” round-ups.

How AI is Streamlining Video Editing. Commentary by Oren Kandel, CEO and Co-Founder of Munch

“In the rapidly evolving media landscape, where video is key for storytelling and engagement, a transformative shift is emerging. Traditional video editing, which is often a labor-intensive and costly process, is being revolutionized by AI.

AI is set to become an integral part of video production by streamlining the process from the outset. Whether it is efficiently organizing and categorizing media files before editing begins, or identifying and extracting content that aligns with current trends so teams can make data-driven decisions, AI has emerged as a powerful tool that can be leveraged throughout the video editing process, allowing content professionals and marketing teams to create engaging content in a matter of minutes.

This shift promises a surge in both content volume and quality, achieved with fewer resources and less time. AI is poised to become an indispensable tool for video editors, introducing a new era of creativity and efficiency in content creation. It is set to redefine industry standards and unlock innovative opportunities in the media landscape of the future, and those who leverage these technologies will surely come out on top.”

The Power of AI-Powered Point-of-Care Ultrasound. Commentary by Sandeep Akkaraju, CEO of Exo

“With a worsening crisis of late-stage health problems resulting from an aging population, it’s critical that the US health system move away from reactive care toward more preventative measures. More and more patients are using emergency rooms and urgent care centers as their first care interactions, putting significant wear on the US healthcare system, since these facilities heavily favor reactive care rather than preventative measures. Think of the US healthcare system as a “hub and spoke” model: at the center are hospitals, and at the spoke ends are outpatient settings and clinics. To address this challenged system, we need to push more patient care interactions toward the spoke ends of the care continuum, making outpatient settings and clinics the place where the bulk of patient care happens and reserving central hospital visits for complex procedures and specialized testing. Load balancing the healthcare system in this way will drive down strain and save on overall care costs.

One key answer to facilitating this movement is artificial intelligence (AI) integrated point-of-care ultrasound (POCUS). Today, obtaining medical answers from ultrasound isn’t a simple process: it requires both successful capture and intelligent interpretation of ultrasound images, which demands skill and training. POCUS today is operated by trained professionals, limiting where and how care is delivered. To boot, there is a dearth of sonographers in the US, which means patients must wait for a diagnosis, sometimes for days or weeks. A hallmark of science is that it is quantifiable and reproducible, yet when it comes to ultrasound, there is ample data showing variability and a lack of reproducibility in the capture and interpretation of images. The quality of answers is determined by the user’s skill in performing the scan, making accurate diagnosis highly operator-dependent. Exam quality varies widely with sonographer skill, experience, and natural human inconsistency.

Applying artificial intelligence (AI) to ultrasound image capture and interpretation introduces ease and standardization across this type of medical imaging, making ultrasound scanning more accurate and reproducible (more scientific, in other words). Reducing operator dependence through AI helps ultrasound yield an answer every time, preventing the need for additional, unnecessary, or specialist testing and saving on downstream procedures. Faster answers will reduce specialist burden and allow patients to move through the care continuum with greater efficiency and accuracy. As a result, AI-driven ultrasound contributes to the quadruple aim in healthcare: better patient outcomes, improved patient experience, greater provider well-being, and reduced costs.”

Impact of ChatGPT and LLMs on the translation industry and beyond. Commentary by Olga Beregovaya, VP of AI and Machine Translation at Smartling

“As ChatGPT celebrates its birthday, we reflect on its transformative impact on technology and the translation industry. The evolution of Large Language Models, exemplified by the OpenAI GPT family, has been a dynamic journey marked by a nuanced trade-off. With additional training parameters and data, these models can perform a myriad of tasks, expanding their capabilities. 

In translation, GPT-4 demonstrates improvements in achieving relevant results through fine-tuning and zero-shot and few-shot examples. However, challenges persist in addressing foreign language vocabulary, grammar, and cultural nuances, highlighting the ongoing need for additional model training techniques and multilingual data. At Smartling, deploying successive GPT models has revealed that tasks like translation and language smoothing require specialized skills in prompt engineering and model fine-tuning, as well as prompt performance validation by native speakers of the target languages.

ChatGPT’s journey embodies the inherent challenges of pushing the boundaries of AI. 

The recent White House pledge involving major tech firms, including OpenAI, signifies a positive step toward addressing the risks associated with deploying AI-based applications and publishing AI-generated content. Allowing independent testing, together with a commitment to data governance and transparency, demonstrates a collective effort to instill confidence in the reliability and security of AI technologies.

While acknowledging the current uncertainties, the industry’s commitment to addressing trust, safety, and performance concerns ensures a foundation for continued growth and innovation in Large Language Models.”
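
To make the few-shot prompting Beregovaya mentions concrete, here is a minimal, hypothetical sketch of steering a chat model toward a translation task by seeding the prompt with curated source/target pairs. This is not Smartling’s implementation; the model choice, example pairs, and prompt wording are illustrative assumptions, using the OpenAI Python client (v1+).

```python
# Hypothetical sketch of few-shot translation prompting -- not Smartling's code.
# Assumes the OpenAI Python client (v1+) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

# A few curated source/target pairs steer terminology, register, and formatting.
FEW_SHOT_PAIRS = [
    ("The battery icon turns red when the charge is low.",
     "Das Batteriesymbol wird rot, wenn der Akkustand niedrig ist."),
    ("Tap 'Settings' to change your preferences.",
     "Tippen Sie auf 'Einstellungen', um Ihre Präferenzen zu ändern."),
]

def translate(text: str) -> str:
    """Translate English to German, seeding the prompt with example pairs."""
    messages = [{"role": "system",
                 "content": "You are a professional English-to-German translator. "
                            "Match the terminology and tone of the examples."}]
    for source, target in FEW_SHOT_PAIRS:
        messages.append({"role": "user", "content": source})
        messages.append({"role": "assistant", "content": target})
    messages.append({"role": "user", "content": text})
    response = client.chat.completions.create(model="gpt-4", messages=messages)
    return response.choices[0].message.content

print(translate("Swipe left to delete a message."))
```

In practice, as the commentary notes, output like this would still need validation by native speakers of the target language before publication.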

Building a modern, inclusive data strategy starts with data connectivity. Commentary by Jerod Johnson, Senior Technology Evangelist at CData

“Data connectivity is essential for any modern company, enabling AI and machine learning technologies to draw insights from diverse and comprehensive data sets. Business and IT leaders can work together to ensure that an organization’s data is accessible to exactly the right stakeholders, from the tools and platforms they prefer. By connecting to data systems like CRMs, ERPs, HCMs, and data warehouses, organizations can build comprehensive views of their business, identify new opportunities, and optimize operations to drive innovation and growth.

A hallmark of modern, effective data connectivity is access to live data, continuously updating data, or both. As organizations modernize their data strategy, querying and analyzing live data allows for rapid decision-making and fosters an improved data culture. If every stakeholder can explore the data they want, when they want, the organization will build data literacy and employees will become more accustomed to leveraging data in their everyday processes.”
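
As one concrete, entirely hypothetical illustration of the live connectivity Johnson describes, the sketch below queries CRM data through a SQL-over-ODBC interface, the pattern connectivity vendors such as CData typically expose. The DSN, table, and column names are invented placeholders, not any specific product’s schema.

```python
# Minimal sketch of querying live CRM data through a SQL-over-ODBC connector.
# "CRM_DSN", "Opportunities", and the column names are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect("DSN=CRM_DSN")  # DSN pre-configured for the CRM driver
cursor = conn.cursor()

# Live query: results reflect the system of record at the moment of execution,
# rather than a stale copy in a separate store.
cursor.execute("""
    SELECT AccountName, SUM(Amount) AS pipeline
    FROM Opportunities
    WHERE Stage <> 'Closed Lost'
    GROUP BY AccountName
    ORDER BY pipeline DESC
""")

for account, pipeline in cursor.fetchmany(10):
    print(f"{account}: {pipeline:,.2f}")

conn.close()
```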

AI creates an enormous challenge to IT Ops and IT infrastructure. Commentary by JB Baker, Vice President of Marketing at ScaleFlux

“Integrating AI into IT operations and infrastructure will be a drain on budgets and power. As organizations adopt AI, they still need sufficient budget for the rest of their IT requirements. AI is not only costly but also consumes a considerable amount of power, and these demands grow even as existing services must be maintained. This presents a dilemma: how can organizations meet the escalating demand for power and budget while keeping their existing infrastructure and services running?

This situation will likely drive a wave of innovation and efficiency reevaluation across all areas of IT infrastructure. The focus will be on optimizing the power used to provide services, reducing the cost of supporting those services, and enhancing equipment sustainability. As AI consumes an ever-larger share of power and capital expenditure budgets, achieving these efficiencies is critical.”

Importance of human-in-the-loop approach to AI. Commentary by Patrick Lin, SVP, GM, Observability at Splunk

“AI won’t replace manual troubleshooting — what’s more likely is it will augment the human who still needs to be in the loop. AI/ML will be faster at sifting through all the available data and recommending where to focus, and in what order, to get to problem identification.”
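
A toy sketch of that sifting-and-ranking idea (not Splunk’s actual method): score each metric by how far its latest reading deviates from its own recent history, then rank the results so a human responder knows where to look first. All metric names and values below are invented.

```python
# Toy illustration: rank metrics by how anomalous their latest reading is
# relative to their own history, producing a triage shortlist for a human.
import statistics

def anomaly_score(history: list[float], latest: float) -> float:
    """Z-score of the latest value against the metric's recent history."""
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history) or 1e-9  # avoid division by zero
    return abs(latest - mean) / stdev

# Invented example data: (recent history, latest reading) per metric.
metrics = {
    "checkout.latency_ms": ([120, 118, 125, 121, 119], 410),
    "api.error_rate":      ([0.010, 0.020, 0.015, 0.012, 0.018], 0.016),
    "db.connections":      ([40, 42, 41, 39, 43], 44),
}

# Highest score first: where the human in the loop should look first.
for name, (history, latest) in sorted(
        metrics.items(), key=lambda kv: anomaly_score(*kv[1]), reverse=True):
    print(f"{name}: score={anomaly_score(history, latest):.1f}")
```

Real observability platforms use far richer models, but the division of labor is the same: the machine narrows the search space, and the human makes the call.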

AI provides the most precise measurements of biology yet. Commentary by Hanjo Kim, SVP of Global Strategy and Head of Medicinal Chemistry at Standigm

“Integrating AI models with the physical world could lead to even more significant breakthroughs, such as biocomputing that connects AI models with living biological systems, including cells, tissues, and organisms. Scientists already collect information about these systems through chemical analysis, image capture, and indirect signal measurement. Once researchers establish effective interfaces between AI models and biological systems that sense biological signals directly and continuously, they will be able to capture live data seamlessly without bias, error, or context leakage. This will enable researchers to create time-based models that can be simulated to generate more data and deepen our understanding of these systems.”

Adapting CX Strategies Based on AI Evolution. Commentary by Matt Whitmer, Chief Revenue Officer and SVP of Marketing at Mosaicx

“As AI continues evolving into a central customer experience (CX) component, it’s reshaping how organizations across various sectors interact with customers and deliver services. A striking 55% of organizations have already embraced AI to enhance their business functions, and it’s paying off, with many seeing an uptick in revenue.

Now, let’s put this into perspective. AI, in its current state, is not unlike an ambitious apprentice, full of potential but still learning the ropes. Businesses are experimenting to see how AI can best serve their CX needs. The crux of effective AI implementation lies in truly understanding your customers. It’s not just about data and algorithms—it’s about gleaning insights from that data and using them to create memorable, personalized customer experiences. 

But let’s remember the value of the human touch in bringing your brand to life. Integrating AI into CX doesn’t have to be about replacing human interactions; rather, it can be about enhancing them. The goal is to find the balance where AI provides insights and efficiency while human empathy and understanding deliver warmth and a personal touch.

Looking ahead, it’s clear that AI will continue to shape the future of CX. As AI evolves, so must our strategies. The businesses that thrive over the long term know how to adapt, and AI, when put to work as a catalyst for innovative customer engagement, can help that adaptation.

The intersection of AI and CX represents an exciting (albeit complex) terrain. The road ahead is one of continuous adaptation, where the successful integration of AI into CX strategies will not just enhance customer satisfaction but also propel business growth in an environment increasingly shaped by AI.”

GenAI dev tools could stifle innovation. Commentary by Ori Keren, former VP of engineering at CloudLock & AT&T

“Just as we quickly lost the ability to remember phone numbers and navigate on our own once we became dependent on our iPhones, in the coming years there’s a risk that people will lose their sense of innovation as they become more dependent on Generative AI to help generate code. Generative AI automates code generation, but the innovation spark starts when humans write and collaborate on code, leading to new ideas. For the long term, the developer community needs to think about ways to preserve knowledge and encourage innovation, especially in light of the rapid developments in this area.”

Importance of proper data preparation in the rapidly evolving age of AI. Commentary by Steven Hillion, SVP of Data and AI, Astronomer

“With the influx of AI models introduced in 2023, this past year generated a new wave of complexity within the data ecosystem. Recent research from Anaconda surveying data scientists highlighted just how much manual time is spent on data preparation, with respondents reporting that they spend more than a third (~38%) of their time on data preparation and cleansing. Data preparation is a time-consuming, repetitive chore that can be remediated with automation and the proper data tools.

The new data provenance standards introduced by the Data & Trust Alliance reinforce the importance of proper data preparation in the rapidly evolving age of AI and empower data leaders to take control of their data lineage practices as soon as possible. Heading into 2024, data teams should prioritize establishing clear data lineage practices to improve all aspects of the data ecosystem, including data governance, data quality, and data operations. Datasets and AI model use cases will only continue to grow in the coming years, so ensuring a firm data foundation today is critical to securing the most accurate data insights for your organization.”
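
As a hedged sketch of the automation Hillion points to, the example below schedules a recurring data-cleansing step as a pipeline using Apache Airflow’s TaskFlow API (the orchestrator behind Astronomer’s platform). The file paths, column names, and cleansing rules are invented for illustration, not a prescribed workflow.

```python
# Hypothetical sketch: automating a recurring cleanse step with Apache
# Airflow's TaskFlow API. Paths and column names are invented placeholders.
import pandas as pd
from pendulum import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_data_prep():
    @task
    def extract() -> str:
        # In practice this would pull from a source system; here, a local file.
        return "/tmp/raw_events.csv"

    @task
    def cleanse(path: str) -> str:
        df = pd.read_csv(path)
        # Basic, repeatable cleansing: deduplicate and drop incomplete rows.
        df = df.drop_duplicates().dropna(subset=["user_id"])
        clean_path = "/tmp/clean_events.csv"
        df.to_csv(clean_path, index=False)
        return clean_path

    cleanse(extract())

daily_data_prep()
```

Encoding preparation steps in a scheduled, versioned pipeline like this also leaves an explicit record of how each dataset was produced, which is the raw material for the lineage practices described above.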
