AI in Healthcare: Transforming Patient Care and Clinical Operations

Artificial Intelligence (AI) is redefining healthcare at every level—improving patient care, enhancing diagnostic accuracy, personalizing treatments, accelerating drug discovery, and streamlining hospital operations. As global health systems grapple with rising costs, staff shortages, and complex patient needs, AI has emerged as a critical force for innovation. From radiology to drug discovery, AI is no longer experimental; it is becoming a cornerstone of modern medicine.

The market for AI in healthcare is expected to exceed $600 billion by 2034, according to one forecast. The explosion of digital health records has created vast opportunities for AI to uncover and act on patterns and insights across the spectrum of healthcare.

How AI Is Being Used in Healthcare

AI’s applications in healthcare are vast and growing. Here are a few examples:

  • Medical Imaging and Diagnostics: AI algorithms can analyze X-rays, MRIs, and other medical images to detect diseases, such as cancer and heart conditions.
  • Virtual Health Assistants: AI chatbots and symptom checkers (e.g., Ada Health, Babylon Health) provide patients with 24/7 support: information and answers to questions, triage guidance, medication refills, and appointment scheduling.
  • Predictive Analytics: Hospitals use AI to forecast patient admissions, identify those at risk for readmission, and anticipate Emergency / ICU demand. AI can predict the risk of developing certain diseases based on patient data, such as medical history, lifestyle, and genetic information.
  • Personalized Medicine: AI helps develop more precise, personalized treatment plans and medication regimens for individual patients.
  • General Research & Development: AI is being used in medical research to analyze massive datasets in order to develop new treatments and improve patient care and outcomes.
  • Drug Discovery and Development: AI models from companies like BenevolentAI and Insilico Medicine accelerate the identification of potential compounds, reducing the timeline from years to months.
  • Robotic Surgery: AI-powered robotic systems can assist surgeons with complex procedures, improving precision and minimizing invasiveness. Systems like the da Vinci Surgical System use AI-assisted robotics to improve surgical precision and minimize recovery times.
  • Administrative Automation: AI streamlines billing, coding, claims management, and insurance processing, reducing paperwork, errors, and staff hours. AI-powered systems can also automate appointment scheduling and reminders, easing staff shortages and reducing burnout.
  • Fraud Detection: AI algorithms can identify suspicious patterns in healthcare claims, helping to reduce fraud.
  • Pregnancy Management: AI applications monitor the health of both mother and fetus through wearable devices and continuously monitored data.

And the healthcare use cases go on and on.
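
As a concrete, purely hypothetical illustration of the predictive-analytics use case above, here is a toy risk-stratification function in Python. Real systems train machine learning models on clinical data; every feature, weight, and threshold below is invented for illustration.

```python
# Hypothetical sketch: a toy readmission-risk score. Real systems use
# trained ML models on clinical data; the features, weights, and
# thresholds here are illustrative only.

def readmission_risk(age, prior_admissions, chronic_conditions, lives_alone):
    """Return a 0-1 risk score and a triage label for a patient."""
    score = 0.0
    score += 0.3 if age >= 65 else 0.1
    score += min(prior_admissions * 0.15, 0.3)   # cap each factor's contribution
    score += min(chronic_conditions * 0.1, 0.2)
    score += 0.2 if lives_alone else 0.0
    label = "high" if score >= 0.6 else "medium" if score >= 0.35 else "low"
    return round(score, 2), label

print(readmission_risk(age=72, prior_admissions=3, chronic_conditions=2,
                       lives_alone=True))        # -> (1.0, 'high')
```

A real deployment would replace the hand-set weights with a model fitted to historical outcomes and validate it against clinical review.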

Tools, Technologies, and Methods Behind AI in Healthcare

Healthcare AI encompasses a wide variety of solutions and, therefore, draws on a mix of advanced technologies and methods:

  • Machine Learning (ML) for patient risk stratification, disease progression modeling, and outcome prediction.
  • Natural Language Processing (NLP) to analyze unstructured clinical notes, extract insights from medical records, and power conversational interfaces.
    • For example, Azure’s language services are used in healthcare to extract clinical information from unstructured medical documents. Features like entity recognition and text analytics for health help identify symptoms, medications, and diagnoses, supporting faster and more accurate decision-making.
  • Computer Vision in radiology, pathology, and dermatology for image-based diagnostics.
  • Robotics for surgeries, rehabilitation support, and hospital logistics (e.g., delivery robots for medications).
  • Cloud AI Platforms such as Microsoft Azure Health Data Services, AWS HealthLake, and Google Cloud Healthcare API for data integration and analysis.
  • Generative AI for drug molecule design, synthetic medical data creation, and personalized patient communication.
  • Tools from Google’s DeepMind and IBM Watson Health have assisted radiologists in detecting conditions such as cancer, stroke, and heart disease earlier and with higher precision.
  • IDx-DR, an FDA-approved AI tool, diagnoses diabetic retinopathy by analyzing eye images, helping to prevent irreversible damage through early detection.
  • Atomwise, an AI-powered drug discovery tool, has identified potential treatments for diseases like Ebola within a day.
  • Aidoc, an AI-driven radiology platform, prioritizes critical cases and detects abnormalities in medical images, significantly enhancing diagnosis and treatment.
  • Dragon Medical One uses speech recognition and speech-to-text features to assist healthcare providers with documenting patient notes, leading to time savings and better accuracy.
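
As a toy illustration of the clinical NLP idea above, the sketch below extracts entities with simple dictionary lookups. Production tools (such as Azure’s text analytics for health) use trained models; the entity lists and note here are hypothetical.

```python
import re

# Hypothetical sketch: keyword-based extraction of clinical entities from an
# unstructured note. The vocabularies below are tiny, made-up examples.

MEDICATIONS = {"metformin", "lisinopril", "atorvastatin"}
SYMPTOMS = {"chest pain", "shortness of breath", "fatigue"}

def extract_entities(note: str) -> dict:
    text = note.lower()
    meds = sorted(m for m in MEDICATIONS if re.search(rf"\b{m}\b", text))
    syms = sorted(s for s in SYMPTOMS if s in text)
    return {"medications": meds, "symptoms": syms}

note = "Patient reports fatigue and chest pain. Continue metformin 500 mg daily."
print(extract_entities(note))
# {'medications': ['metformin'], 'symptoms': ['chest pain', 'fatigue']}
```

Trained NLP models add what a lookup cannot: negation handling ("denies chest pain"), abbreviations, misspellings, and entity linking to coding systems.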

Benefits of AI in Healthcare

The adoption of AI has brought measurable benefits across the sector:

  • Improved Diagnostics: AI-powered imaging tools have demonstrated higher sensitivity in detecting certain conditions than human experts alone.
  • Personalized Care: AI helps tailor treatment plans to individual patient profiles, improving outcomes.
  • Operational Efficiency: Predictive analytics optimize staffing, reduce wait times, and cut costs.
  • Faster Drug Development: AI-driven discovery platforms shorten the drug development cycle, lowering costs and improving pipeline success.
  • Accessibility: Virtual assistants extend care access to underserved populations by providing round-the-clock guidance.

Pitfalls and Challenges of AI in Healthcare

Despite the proven results and huge promise, several challenges and risks persist:

  • Data Quality: While significant amounts of data are available, not all of it is of high quality, and significant efforts are needed to ensure that the data driving AI solutions is clean and accurate.
  • Data Bias and Inequality: AI models trained on non-diverse datasets may produce biased outcomes, particularly for underrepresented populations.
  • Regulatory Hurdles: The FDA and other agencies require rigorous testing and approval, slowing AI adoption.
  • Failed Projects: IBM Watson Health, once hyped as a revolutionary tool, failed to meet expectations in oncology due to overpromising and underdelivering.
  • Data Privacy Concerns: With vast amounts of sensitive data, breaches or misuse pose serious risks.
  • Integration Challenges: Many hospitals face difficulties embedding AI into legacy systems and workflows.
  • High AI Costs: AI solutions are rarely cheap, and not all healthcare companies can afford what they desire. Companies need to carefully and strategically choose which solutions to implement.
  • Overreliance on AI: Excessive trust in algorithms could lead to errors if not combined with human oversight.

The Future of AI in Healthcare

The trajectory of AI points toward deeper integration into healthcare delivery:

  • Precision Medicine at Scale: AI will increasingly guide genomics-driven treatments, tailoring care to a patient’s DNA profile.
  • Real-Time Monitoring: Wearables and IoT devices paired with AI will continuously track patient health and alert clinicians to early warning signs.
  • Generative AI in Research: AI will help simulate clinical trials and accelerate hypothesis generation.
  • Holistic Care Platforms: AI-powered systems will unify patient data from hospitals, clinics, and home devices into seamless health records.
  • Ethical AI Frameworks: Future AI systems will be built with fairness, accountability, and transparency at their core.

How Healthcare Organizations Can Gain an Advantage

To stay competitive and maximize AI’s potential, healthcare providers and companies should:

  1. Invest in High-Quality Data: Ensure datasets are diverse, representative, and securely stored.
  2. Adopt AI Incrementally: Start with specific use cases—such as imaging, scheduling, or claims processing—before scaling enterprise-wide.
  3. Prioritize Human-AI Collaboration: Position AI as a support tool to augment, not replace, clinicians.
  4. Strengthen Compliance and Ethics: Build governance frameworks around data privacy, bias mitigation, and transparency.
  5. Train and Upskill Staff: Equip medical professionals and administrators with the skills to effectively use AI.
  6. Foster Partnerships: Collaborate with AI startups, academic research labs, and technology providers for faster innovation.

Conclusion

AI in healthcare represents both extraordinary promise and complex challenges. It is already improving patient outcomes, optimizing hospital operations, and reducing the time and cost of drug development. Yet, for every breakthrough, there are lessons in bias, regulation, and integration that remind us AI is not a silver bullet. However, its adoption and success rates in healthcare, and across the board, are expected to grow significantly. Not using AI is not an option. The future belongs to healthcare organizations that use AI responsibly and effectively—balancing innovation with ethics, automation with compassion, and efficiency with equity.

This article is a part of an “AI in …” series that shares information about AI in various industries and business functions. Be on the lookout for future (and past) articles in the series.

Thanks for reading and good luck on your data (AI) journey!

Other “AI in …” articles in the series:

AI in the Hospitality Industry: Transforming Guest Experiences and Operations

AI in Gaming: How Artificial Intelligence is Powering Game Production and Player Experience

Developing metrics for your analytics project

When starting an analytics project, one of the most important decisions you will make is identifying the right metrics. Metrics serve as the compass for the initiative—they show whether you are on the right track, communicate achievements, highlight challenges, uncover blind spots, and ultimately, along with guiding future decisions, they demonstrate the value of the project to stakeholders. But designing metrics is not as simple as picking a single “success number.” To truly guide decision-making, you need a holistic set of measures that reflect multiple dimensions of performance.

Why a Holistic View Matters

Analytics projects sometimes fall into the trap of focusing on only one type of metric. For example, a project might track quantity (e.g., number of leads generated) while ignoring quality (e.g., lead conversion rate). Or it may measure cost savings but fail to consider user satisfaction, leading to short-term wins but long-term disengagement.

Develop Metrics from Multiple Dimensions

To avoid this pitfall, it’s critical to develop a balanced framework that includes multiple perspectives:

  • Quantity: How much output is produced? Examples include number of units produced, sales revenue, or number of new customers added.
  • Quality: What is the quality of the output? Examples include accuracy rates, defect counts, or error percentages.
  • Time: How long does it take to achieve the output? Or in other words, what timeframe is the quantity and quality measured over? Is it sales revenue per hour, per day, per month, or per year?
  • Costs: What resources are being consumed? Metrics might include infrastructure costs, labor hours and costs, materials costs, or overall project spend.
  • Satisfaction: How do stakeholders, customers, or employees feel about the results? Feedback surveys, adoption rates, product ratings, and net promoter scores (NPS) are common ways of identifying this information.

Each of these perspectives contributes to the full story of your analytics project. If one dimension is missing, you risk optimizing for one outcome at the expense of another.

Efficiency, Effectiveness, and Impact Metrics

Another way you can classify your metrics to achieve a holistic view is with three overarching categories: Efficiency, Effectiveness, and Impact.

  • Efficiency Metrics
    • These measure how well resources are used and answer “Are we doing things right?” They focus on inputs versus outputs.
      • Example: “Average work hours per product” shows how quickly work gets done.
      • Example: “Cost per customer acquired” reflects the efficiency of your sales operations.
    • Efficiency metrics often tie directly to quantity, cost, and time.
  • Effectiveness Metrics
    • These measure how well goals are achieved—whether the project delivers the intended results—and answer “Are we doing the right things?”
      • Example: “Customer satisfaction” demonstrates how happy customers are with our products and services.
      • Example: “Actual to Target” shows how things are tracking compared to the goals that were set.
    • Effectiveness metrics often involve quality, satisfaction, and time.
  • Impact Metrics
    • These measure the broader business or organizational outcomes influenced by some activity.
      • Example: “Market share and revenue growth” shows financial state from a broader market and overall standpoint.
      • Example: “Return on Investment (ROI)” is the ultimate metric for financial performance.
    • Impact metrics communicate how we are doing against our long-term, strategic goals. They often combine quantity, quality, satisfaction, and time dimensions.
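
The three categories above can be made concrete with a few one-line calculations. The sample figures below are invented purely for illustration:

```python
# Illustrative sketch: one example metric from each category, computed from
# made-up sample numbers.

def cost_per_customer(total_spend, customers_acquired):   # efficiency
    return total_spend / customers_acquired

def actual_to_target(actual, target):                     # effectiveness
    return actual / target

def roi(gain, investment):                                # impact
    return (gain - investment) / investment

print(cost_per_customer(50_000, 200))   # 250.0 spent per customer acquired
print(actual_to_target(90, 120))        # 0.75 -> 75% of the goal reached
print(roi(180_000, 120_000))            # 0.5 -> 50% return on investment
```

Even this trivial example shows how the categories interlock: the efficiency figure feeds the impact figure, and neither is meaningful without the effectiveness check against a target.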

The Significance of the Time Dimension

Among all the dimensions used in metrics, time is especially powerful because it adds critical context to nearly every metric. Without time, numbers can be misleading; just about any metric becomes more relevant once a time component is attached. For instance:

  • A quantity metric of “100 new customers” becomes far more meaningful when paired with “this month” versus “since company founding.”
  • A quality metric of “95% data accuracy” is less impressive if it takes weeks to achieve, compared to real-time cleansing.
  • A cost metric of “$100,000 project spend” raises different questions depending on whether it’s a one-time investment or a recurring monthly expense.

By always asking, “Over what time frame?”, you unlock a truer understanding of performance. In short, the time dimension transforms static measures into dynamic insights. It allows you to answer not just “What happened?” but also “When did it happen?”, “How long did it take?”, and “How is it changing over time?”—questions that are generally crucial for actionable decision-making.

Time adds context to every other metric. Think of it as the axis that brings your measures to life. Quantity without time tells you how much, but not how fast. Quality without time shows accuracy, but not whether results are timely enough to act upon. Costs without time hide the pace at which expenses accumulate. And satisfaction without time misses whether perceptions improve, decline, or stay consistent over an initiative’s lifecycle.
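
A small sketch shows how attaching a timeframe turns a raw count into a rate. The dates and counts below are made up:

```python
from datetime import date

# Hypothetical sketch: the same raw quantity becomes a rate once a
# timeframe is attached.

def rate_per_period(count, start, end, period_days=30):
    """Average count per `period_days` over the window [start, end]."""
    span = (end - start).days or 1
    return count / span * period_days

# "100 new customers" means very different things over one month
# versus over five years:
print(rate_per_period(100, date(2024, 3, 1), date(2024, 3, 31)))  # ~100 per month
print(rate_per_period(100, date(2019, 3, 1), date(2024, 3, 1)))   # ~1.6 per month
```

The number "100" is identical in both calls; only the time window changes, and with it the entire interpretation of performance.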

The Significance of Timeliness

Another important consideration is timeliness. Metrics must reach decision makers quickly enough to support timely decisions. For example:

  • A metric may deliver accurate insights, but if it takes three weeks to refresh the data and the dashboard that displays it, the value erodes.
  • A machine learning model may predict outcomes with high accuracy, but if the scoring process delays operational decisions, the benefit diminishes.

Therefore, in addition to defining and building the metrics for a project, the delivery mechanism (such as a dashboard) must also be thought through, so that the entire pipeline, from data sourcing to aggregation to dashboard refresh, runs quickly enough to put the metrics in front of users while they can still act on them.
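
One way to operationalize timeliness is to check each metric's data age against an agreed refresh SLA. A minimal sketch, with hypothetical metric names and SLAs:

```python
from datetime import datetime, timedelta

# Hypothetical sketch: flag metrics whose source data is older than the
# refresh SLA agreed with decision makers. Names and SLAs are made up.

def stale_metrics(last_refreshed: dict, sla: dict, now: datetime) -> list:
    """Return metric names whose data is older than their SLA."""
    return sorted(
        name for name, refreshed_at in last_refreshed.items()
        if now - refreshed_at > sla.get(name, timedelta(days=1))
    )

now = datetime(2024, 6, 1, 12, 0)
last = {"daily_sales": datetime(2024, 5, 30), "nps": datetime(2024, 5, 20)}
slas = {"daily_sales": timedelta(days=1), "nps": timedelta(days=30)}
print(stale_metrics(last, slas, now))   # ['daily_sales'] is past its SLA
```

A check like this can run alongside the dashboard refresh and alert the team before stale numbers reach decision makers.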

Putting It All Together

When developing metrics for your analytics project, take a step back and ensure you have a comprehensive, multi-angle approach, by asking:

  • Do we know how much is being achieved/produced (quantity)?
  • Do we know how well it is being achieved/produced (quality)?
  • Do we know how fast results are being delivered (time)?
  • Do we know how much it costs to achieve (costs)?
  • Do we know how it feels to those affected (satisfaction)?
  • Do we know whether we are efficiently using resources?
  • Do we know whether we are effective in reaching goals?
  • Do we know what impact this work is having on the organization?
  • And for the above questions, always get a perspective on time … when? over what timeframe?
  • When are updates to the metrics needed by (real-time, hourly, daily, weekly, monthly, etc.)?

By building metrics across these dimensions, you create a more reliable, meaningful, and balanced framework for measuring success. More importantly, you ensure that the analytics project supports not only the immediate technical objectives but also the broader organizational goals.

Thanks for reading! Good luck on your analytics journey!

AI in Gaming: How Artificial Intelligence is Powering Game Production and Player Experience

The gaming industry isn’t just about fun and entertainment – it’s one of the largest and fastest-growing industries in the world. Valued at over $250 billion in 2024, it’s expected to surge past $300 billion by 2030. And at the center of this explosive growth? Artificial Intelligence (AI). From streamlining game development to building creative assets faster to shaping immersive and personalized player experiences, AI is transforming how games are built and how they are played. Let’s explore how.

1. AI in Gaming Today

AI is showing up both behind the scenes (in development studios and in technology devices) and inside the games themselves.

  • AI Agents & Workflow Tools: A recent survey found that 87% of game developers already incorporate AI agents into development workflows, using them for tasks such as playtesting, balancing, localization, and code generation. For bug detection, Ubisoft developed Commit Assistant, an AI tool that analyzes millions of lines of past code and bug fixes to predict where new errors are likely to appear. This has cut down debugging time and improved code quality, helping teams focus more on creative development rather than repetitive QA.
  • Content & Narrative: Over one-third of developers utilize AI for creative tasks like dynamic level design, animation, dialogue writing, and experimenting with gameplay or story concepts. Games like Minecraft and No Man’s Sky use AI to dynamically create worlds, keeping the player experience fresh.
  • Rapid Concept Ideation: Concept artists use AI to generate dozens of initial style options—then pick a few to polish with humans. Way faster than hand-sketching everything.
  • AI-Powered Game Creation: Roblox recently announced generative AI tools that let creators use natural language prompts to generate code and 3D assets for their games. This lowers the barrier for new developers and speeds up content creation for the platform’s massive creator community.
  • Generative AI in Games: On Steam, roughly 20% of games released in 2025 use generative AI—up 681% year-on-year—and 7% of the entire library now discloses usage of GenAI assets like art, audio, and text.
  • Immersive NPCs: Studios like Jam & Tea, Ubisoft, and Nvidia are deploying AI for more dynamic, responsive NPCs that adapt in real time—creating more immersive interactions. These smarter, more adaptive NPCs react more realistically to player actions.
  • AI-Driven Tools from Tech Giants: Microsoft’s Muse model generates gameplay based on player interaction, and some Activision titles in the Call of Duty series reportedly use AI-generated content.
  • Playtesting Reinvented: Brands like Razer now embed AI into playtesting: gamers can test pre-alpha builds, and AI tools analyze gameplay to help QA teams—claiming up to 80% reduction in playtesting cost. EA has been investing heavily in AI-driven automated game testing, where bots simulate thousands of gameplay scenarios. This reduces reliance on human testers for repetitive tasks and helps identify balance issues and bugs much faster.
  • Personalized Player Engagement: Platforms like Tencent, the largest gaming company in the world, and Zynga leverage AI to predict player behavior and keep them engaged with tailored quests, events, offers, and challenges. This increases retention while also driving monetization.
  • AI Upscaling and Realism: While not a game producer, NVIDIA’s DLSS (Deep Learning Super Sampling) has transformed how games are rendered. By using AI to upscale graphics in real time, it delivers high-quality visuals at faster frame rates—giving players a smoother, more immersive experience.
  • Responsible AI for Fair Play and Safety: Microsoft is using AI to detect toxic behavior and cheating across Xbox Live. Its AI models can flag harassment or unfair play patterns, keeping the gaming ecosystem healthier for both casual and competitive gamers.
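
Procedural world generation of the kind described above can be sketched in a few lines: a seeded random walk carves a guaranteed path through a tile map, so each seed yields a different but always-completable layout. This is an illustrative toy, not any studio's actual algorithm.

```python
import random

# Hypothetical sketch of procedural content generation: carve a path of
# floor tiles (".") through a wall grid ("#") from top-left to bottom-right.

def generate_map(width, height, seed):
    rng = random.Random(seed)                 # same seed -> same level
    grid = [["#"] * width for _ in range(height)]
    x, y = 0, 0
    grid[y][x] = "."
    while (x, y) != (width - 1, height - 1):
        if rng.random() < 0.5 and x < width - 1:
            x += 1                            # step right
        elif y < height - 1:
            y += 1                            # step down
        grid[y][x] = "."
    return ["".join(row) for row in grid]

for row in generate_map(8, 4, seed=42):
    print(row)
```

Real systems layer many such generators (terrain, loot, quests) and use the seed so that players can share or replay the exact same world.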

2. Tools, Technologies, and Platforms

Let’s take a look at things from the technology type standpoint. As you may expect, the gaming industry uses several AI technologies:

  • AI Algorithms: AI algorithms dynamically produce game content—levels, dialogue, music—based on developer input, on the fly. This boosts replayability and reduces production time. And tools like DeepMotion’s animation generator and IBM Watson integrations are already helping studios prototype faster and more creatively.
  • Asset Generation Tools: Indie studios like Krafton are exploring AI to convert 2D images into 3D models, powering character and world creation with minimal manual sculpting.
  • AI Agents: AI agents run thousands of tests, spot glitches, analyze frame drops, and flag issues—helping devs ship cleaner builds faster. This type of AI-powered testing reduces bug detection time by up to 50%, accelerates quality assurance, and simulates gameplay scenarios on a massive scale.
  • Machine Learning Models: AI tools, typically ML models, analyze player behavior to optimize monetization, reduce churn, tailor offers, balance economies, anticipate player engagement, and even adjust difficulty dynamically—figures range from 56% of studios using analytics, to 77% for player engagement, and 63% using AI for economy and balance modeling.
  • Natural Language Processing (NLP): NLP powers conversational NPCs and AI-driven storytelling. Roblox’s Cube 3D and Ubisoft’s experiments with AI-generated dialogue and 3D assets are making NPCs more believable and story elements more dynamic.
  • Generative AI: Platforms like Roblox are enabling creators to generate code and 3D assets from text prompts, lowering barriers to entry. AI tools now support voice synthesis, environmental effects, and music generation—boosting realism and reducing production costs.
  • Computer Vision: Used in quality assurance and automated gameplay testing, especially at studios like Electronic Arts (EA).
  • AI-Enhanced Graphics: NVIDIA’s DLSS uses AI upscaling to deliver realistic graphics without slowing down performance.
  • GitHub Copilot for Code: Devs increasingly rely on tools like Copilot to speed coding. AI helps write repetitive code, refactor, or even spark new logic ideas.
  • Project Scoping Tools: AI tools can forecast delays and resource bottlenecks. Platforms like Tara AI use machine learning to forecast engineering tasks, timelines, and resources—helping game teams plan smarter. Also, by analyzing code commits and communication patterns, AI can flag when teams are drifting off track. This “AI project manager” approach is still in its early days, but it’s showing promise.
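
Dynamic difficulty adjustment, mentioned under machine learning models above, can be sketched as a simple feedback loop. The update rule and constants below are illustrative assumptions, not a shipped system:

```python
# Hypothetical sketch of dynamic difficulty adjustment: nudge difficulty
# toward a target player win rate after each batch of matches.

def adjust_difficulty(difficulty, recent_win_rate, target=0.5, step=0.1):
    """Raise difficulty when the player wins too often, lower it otherwise."""
    difficulty += step * (recent_win_rate - target)
    return max(0.1, min(1.0, difficulty))     # clamp to a sane range

d = 0.5
for win_rate in (0.9, 0.9, 0.2):              # player dominating, then losing
    d = adjust_difficulty(d, win_rate)
print(round(d, 3))
```

Production systems replace the fixed step with learned models and factor in signals like session length and frustration indicators, but the feedback shape is the same.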

3. Benefits and Advantages

Companies adopting AI are seeing significant advantages:

  • Efficiency Gains & Cost Savings: AI reduces development time significantly—some estimates cite 30–50% faster content creation or bug testing. Ubisoft’s Commit Assistant reduces debugging time by predicting where code errors may occur.
  • Creative Enhancement: Developers can shift time from repetitive tasks to innovation—allowing deeper storytelling and richer workflows.
  • Faster Testing Cycles: Automated QA, asset generation, and playtesting can slash both time and costs (some developers report half the animation workload gone). For example, EA’s automated bots simulate thousands of gameplay scenarios, accelerating testing.
  • Increased Player Engagement & Retention: AI-driven adaptive difficulty, procedural content, and responsive NPCs keep games fresh and fun, boosting immersion and retention—users report realism and engagement gains of 35–45%. Zynga uses AI to identify at-risk players and intervene with tailored offers to reduce churn.
  • Immersive Experiences: DLSS and AI-driven NPC behavior make games look better and feel more alive.
  • Revenue & Monetization: AI analytics enhance monetization strategies, increase ad effectiveness, and optimize in-game economies—improvements of around 15–25% are reported.
  • Global Reach & Accessibility: Faster localization and AI chat support reduce response times and broaden global player reach.

For studios, these benefits translate to lower costs, faster release cycles, and stronger player engagement metrics—meaning lower expenses and higher revenues.

4. Pitfalls and Challenges

Of course, it’s not all smooth sailing. Some issues include:

  • Bias in AI Systems: Poorly trained AI can unintentionally discriminate—for example, failing to fairly moderate online communities.
  • Failed Investments: AI tools can be expensive to build and maintain, and some studios have abandoned experiments when returns weren’t immediate.
  • Creativity vs. Automation: Overreliance on AI-generated content risks creating bland, formulaic games. There’s worry about AI replacing human creators or flooding the market with generic, AI-crafted content.
  • Legal Risks, Ethics & Originality: Issues around data ownership, creative rights, and transparency are raising developer anxiety. Is AI stealing from artists? Activision’s Black Ops 6 faced backlash over generative assets, and Fortnite’s AI-voiced Vader stirred labor concerns.
  • Technical Limitations: Not all AI tools hit the mark technically. Early versions of NVIDIA’s G-Assist (since patched) had performance problems—freezing and tanking frame rates—a reminder that AI isn’t magic yet and carries risks, especially for early adopters of new tools and solutions.
  • Speed vs. Quality: Rushing AI-generated code without proper QA can result in outages or bugs—human oversight still matters.
  • Cost & Content Quality Concerns: While 94% of developers expect long-term cost reductions, upfront costs and measuring ROI remain challenges—especially given concerns over originality in AI-generated content.

In general, balancing innovation with human creativity remains a challenge.

5. The Future of AI in Gaming

Looking ahead, we can expect:

  • More Personalized Gameplay: Games that adapt in real-time to individual player styles.
  • Generative Storytelling: Entire narratives that shift based on player choices, powered by large language models.
  • AI Co-Creators: Game development may become a hybrid of human creativity and AI-assisted asset generation.
  • Smarter Communities: AI will help moderate toxic behavior at scale, creating safer online environments.
  • Games Created from Prompts: Imagine generating a mini-game just by describing it. Surveys tease that future—fully dynamic, AI-generated experiences built from user prompts could enable personalized game creation—though IP and ethics concerns may slow or limit adoption.
  • NPCs That Remember and Grow: AI characters that adapt, remember player choices, and evolve—like living game companions.
  • Cloud & AR/VR Boost Growth: AI will optimize streaming, drive immersive data-driven VR/AR experiences, and power e-sports analytics.
  • Advanced NPCs & Narrative Systems: Expect smarter, emotionally adaptive NPCs and branching narratives shaped by AI.
  • Industry Expansion: The AI in gaming market is projected to swell—from roughly $1.2 billion in 2022 to anywhere between $5–8 billion by 2028, and up to $25 billion by 2030.
  • Innovation Across Studios: Smaller indie developers continue experimenting freely with AI, while larger studios take a cautious, more curated approach.
  • Streaming, VR/AR & E-sports Integration: AI-driven features—matchmaking, avatar behavior, and live content moderation—will grow more sophisticated in live and virtual formats.

With over 80% of gaming companies already investing in AI in some form, it’s clear that AI adoption is accelerating and will continue to grow. Survival without it will become impossible.

6. How Companies Can Stay Ahead

To thrive in this fast-changing environment, gaming companies should:

  • Invest in R&D: Experiment with generative AI, NPC intelligence, and new personalization engines. Become proficient in the key tools and technologies.
  • Focus on Ethics: Build AI responsibly, with safeguards against bias and toxicity.
  • Upskill Teams: Developers and project managers need to understand and use AI tools, not just traditional game engines.
  • Adopt Incrementally: Start with AI in QA and testing (low-risk, high-reward) before moving into core gameplay mechanics.
  • Start with High-ROI Use Cases: Begin with AI applications like testing, balancing, localization, and analytics—where benefits are most evident.
  • Blend AI with Human Creativity: Use AI to augment—not replace—human designers and writers. Leverage it to iterate faster, then fine-tune for quality.
  • Ensure IP and Ethical Compliance: Clearly disclose AI use, respect IP boundaries, and integrate transparency and ethics into development pipelines.
  • Monitor Tools & Stay Agile: AI tools evolve fast—stay informed, and be ready to pivot as platforms and capabilities shift.
  • Train Dev Teams: Encourage developers to explore AI assistants, generative tools, and optimization models so they can use them responsibly and creatively.
  • Focus on Player Trust: Transparently communicating AI usage helps mitigate player concerns around authenticity and originality.
  • Scale Intelligently: Use AI-powered analytics to understand player behavior—then refine content, economy, and retention strategies based on real data.

There will be some trial and error as companies move into the new landscape and try/adopt new technologies, but companies must adopt AI and become good at using it to stay competitive.

Final Word

AI isn’t replacing creativity in gaming—it’s amplifying it. From Ubisoft’s AI bug detection to Roblox’s generative tools and NVIDIA’s AI-enhanced graphics, the industry is already seeing massive gains. As studios continue blending human ingenuity with machine intelligence, the games of the future will be more immersive, personalized, and dynamic than anything we’ve seen before. But it’s clear: AI is no longer optional for game development—it is a must. Companies will need to become proficient with the AI tools they choose and with integrating them into the overall production cycle. They will also need to choose partners carefully for AI implementations that aren’t handled by in-house staff.

This article is a part of an “AI in …” series that shares information about AI in various industries and business functions. Be on the lookout for future (and past) articles in the series.

Thanks for reading and good luck on your data (AI) journey!

Other “AI in …” articles in the series:

AI in Hospitality

Choosing the Right Chart to display your data in Power BI or any other analytics tool

Data visualization is at the heart of analytics. Choosing the right chart or visual can make the difference between insights that are clear and actionable, and insights that remain hidden. There are many visualization types available for showcasing your data, and choosing the right ones for your use cases is important. Below, we’ll walk through some common scenarios and share information on the charts best suited for them, and will also touch on some Power BI–specific visuals you should know about.

1. Showing Trends Over Time

When to use: To track how a measure changes over days, months, or years.

Best charts:

  • Line Chart: The classic choice for time series data. Best when you want to show continuous change. In Power BI, the line chart visual can also be used for forecasting trends.
  • Area Chart: Like a line chart but emphasizes volume under the curve—great for cumulative values or when you want to highlight magnitude.
  • Sparklines (Power BI): Miniature line charts embedded in tables or matrices. Ideal for giving quick context without taking up space.
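Behind every line chart is a measure aggregated per time period. As an illustration outside Power BI (Python with pandas is my own choice here, not part of the tool), this is the kind of series a line chart plots:

```python
import pandas as pd

# Hypothetical daily sales; a line chart would plot the monthly totals.
daily = pd.DataFrame(
    {"sales": [100, 120, 90, 150, 130, 170]},
    index=pd.to_datetime(
        ["2025-01-05", "2025-01-20", "2025-02-03",
         "2025-02-18", "2025-03-02", "2025-03-27"]
    ),
)

# Aggregate to one data point per month -- the series the line chart displays.
monthly = daily["sales"].resample("MS").sum()
print(monthly.tolist())  # [220, 240, 300]
```

In Power BI the date hierarchy performs this aggregation for you; the sketch just makes explicit what the visual is summarizing.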

2. Comparing Categories

When to use: To compare values across distinct groups (e.g., sales by region, revenue by product).

Best charts:

  • Column Chart: Vertical bars for category comparisons. Good when categories are on the horizontal axis.
  • Bar Chart: Horizontal bars—useful when category names are long or when ranking items. It is usually a better choice than a column chart when there are many categories.
  • Stacked Column/Bar Chart: Show category totals and subcategories in one view. Works for proportional breakdowns, but can get hard to compare across categories.

3. Understanding Relationships

When to use: To see whether two measures are related (e.g., advertising spend vs. sales revenue).

Best charts:

  • Scatter Chart: Plots data points across two axes. Useful for correlation analysis. Add a third variable with bubble size or color to generate more insights. This chart can also be useful for identifying anomalies/outliers in the data.
  • Line & Scatter Combination: Power BI lets you overlay a line for trend direction while keeping the scatter points.
  • Line & Bar/Column Chart Combination: Power BI also offers line-and-column combination charts, letting you relate a comparison measure (columns) to a trend measure (line) on shared axes.
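The relationship a scatter chart shows visually can also be quantified. As a language-neutral sketch (Python with NumPy, my own illustration rather than a Power BI feature), the correlation coefficient summarizes the strength of the linear relationship between the two measures:

```python
import numpy as np

# Hypothetical paired measures: advertising spend vs. sales revenue.
ad_spend = np.array([10, 20, 30, 40, 50])
revenue = np.array([100, 210, 290, 410, 500])

# The correlation coefficient quantifies what the scatter chart shows visually.
r = np.corrcoef(ad_spend, revenue)[0, 1]
print(round(r, 3))  # close to 1.0 indicates a strong positive relationship
```

A value near +1 or -1 confirms the pattern you see in the scatter; values near 0 suggest the points are just noise.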

4. Highlighting Key Metrics

Sometimes you don’t need a chart—you just want a single number to stand out. These types of visuals are great for high-level executive dashboards, or for the summary page of dashboards in general.

Best visuals in Power BI:

  • Card Visual: Displays one value clearly, like Total Sales.
  • KPI Visual: Adds target context and status indicator (e.g., actual vs. goal).
  • Gauge Visual: Circular representation of progress toward a goal—best for showing percentages or progress to target. For example, a performance rating score displayed against its goal scale.

5. Distribution Analysis

When to use: To see how data is spread across categories or ranges.

Best charts:

  • Column/Bar Chart with bins: Useful for creating histograms in Power BI.
  • Box-and-Whisker Chart (custom visual): Shows median, quartiles, and outliers.
  • Pie/Donut Charts: While often overused, they can be effective for showing composition when categories are few (ideally 3–5). For example, show the number and percentage of employees in each department.
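The "columns with bins" approach boils down to grouping a numeric field into ranges and counting rows per range. A minimal sketch (Python with pandas, my own illustration, not how Power BI implements bins internally):

```python
import pandas as pd

# Hypothetical ages; binning turns a numeric column into histogram categories,
# which is what Power BI's "bins" option does behind a column chart.
ages = pd.Series([23, 27, 31, 35, 38, 42, 47, 52])
bins = pd.cut(ages, bins=[20, 30, 40, 50, 60])

# One count per bin -- these counts become the column heights of the histogram.
counts = bins.value_counts(sort=False)
print(counts.tolist())  # [2, 3, 2, 1]
```

In Power BI you would instead right-click the field and choose New group → Bin, but the underlying computation is the same.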

6. Spotting Problem Areas

When to use: To identify anomalies or areas needing attention across a large dataset.

Best charts:

  • Heatmap: A table where color intensity represents value magnitude. Excellent for finding hot spots or gaps. This can be implemented in Power BI by using a Matrix visual with conditional formatting.
  • Treemap: Breaks data into rectangles sized by value—helpful for hierarchical comparisons and for easily identifying the major components of the whole.
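A heatmap is just a matrix of values where color encodes magnitude. The underlying data is a pivot, sketched here in Python with pandas (my own illustration; in Power BI the Matrix visual builds this for you):

```python
import pandas as pd

# Hypothetical incident counts; a heatmap colors this matrix by magnitude.
df = pd.DataFrame({
    "region": ["East", "East", "West", "West"],
    "month": ["Jan", "Feb", "Jan", "Feb"],
    "incidents": [5, 12, 3, 8],
})

# Pivot rows/columns/values -- the grid the Matrix visual then color-codes.
matrix = df.pivot_table(index="region", columns="month",
                        values="incidents", aggfunc="sum")
# The largest cell is the "hot spot" the conditional formatting makes obvious.
print(matrix.loc["East", "Feb"])  # 12
```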

7. Detail-Level Exploration

When to use: To dive into raw data while keeping formatting and hierarchy.

Best visuals:

  • Table: Shows granular row-level data. Best for detail reporting.
  • Matrix: Adds pivot-table–like functionality with rows, columns, and drill-down. Often combined with conditional formatting and sparklines for added insight.

8. Part-to-Whole Analysis

When to use: To see how individual parts contribute to a total.

Best charts:

  • Stacked Charts: Show both totals and category breakdowns.
  • 100% Stacked Charts: Normalize totals so comparisons are by percentage share.
  • Treemap: Visualizes hierarchical data contributions in space-efficient blocks.
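A 100% stacked chart simply normalizes each group to percentage shares of its total. A small sketch of that computation (Python, my own illustration with made-up numbers):

```python
# Hypothetical regional sales by product; a 100% stacked chart plots each
# value as its percentage share of the group total.
data = {
    "East": {"A": 30, "B": 70},
    "West": {"A": 20, "B": 30},
}

# Normalize each region so its segments sum to 100%.
shares = {
    region: {product: round(100 * value / sum(products.values()), 1)
             for product, value in products.items()}
    for region, products in data.items()
}
print(shares)  # East: A 30%, B 70%; West: A 40%, B 60%
```

This is why 100% stacked charts are better for comparing composition across groups of very different sizes: the raw totals are deliberately discarded.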

Quick Reference: Which Chart to Use?

Scenario | Best Visuals
Tracking trends, forecasting trends | Line, Area, Sparklines
Comparing categories | Column, Bar, Stacked
Showing relationships | Scatter, Line + Scatter, Line + Column/Bar
Highlighting metrics | Card, KPI, Gauge
Analyzing distributions | Histogram (columns with bins), Box & Whisker, Pie/Donut (for few categories)
Identifying problem areas | Heatmap (Matrix with colors), Treemap, Scatter
Exploring detail data | Table, Matrix
Showing part-to-whole | Stacked Column/Bar, 100% Stacked, Treemap, Pie/Donut

The below graphic shows the visualization types available in Power BI. You can also import additional visuals by clicking the “3-dots” (get more visuals) at the bottom of the visualization icons.

Summary

Power BI, like other BI/analytics tools, offers a rich set of visuals, each designed to represent data in a way that suits a specific set of analytical needs. The key is to match the chart type with the story you want the data to tell. Whether you’re showing a simple KPI, uncovering trends, or surfacing problem areas, choosing the right chart ensures your insights are clear, actionable, and impactful. Depending on your scenario, it can also be beneficial to get feedback from the user population on what other visuals they might find useful or how else they would like to see the data.

Thanks for reading! And good luck on your data journey!

Workday’s Game-Changing Move: Acquiring Paradox (an AI Recruitment software player)

On August 21, 2025, alongside its Q2 fiscal 2026 financial results, Workday announced a definitive agreement to acquire Paradox. The acquisition is a strategic move to integrate Paradox’s conversational AI for high-volume candidate experience into Workday’s enterprise platform. As someone who works in the data space, works with Workday, and supports the HR function, this is of interest to me.

Who They Are

Workday
Founded in 2005, Workday is a leading cloud-based enterprise software provider specializing in human capital management (HCM) and financial management. Trusted by over 11,000 organizations—including more than 65% of the Fortune 500—it helps companies manage payroll, recruiting, and more through its AI-first platform. Workday has been going all in on AI, which is why it was interested in Paradox.

Paradox
Launched in 2016, Paradox is an innovative player in conversational AI for recruitment. Known for its digital assistant Olivia, Paradox streamlines high-volume hiring processes—handling things like screening, scheduling, and candidate Q&A via chat, SMS, or mobile interfaces. It serves clients like McDonald’s, Unilever, and Chipotle and has powered over 189 million candidate interactions, achieving conversion rates above 70% and cutting time-to-hire to as fast as 3.5 days. That is an impressive time-to-hire statistic!

Why This Acquisition Matters

  1. Extends Workday’s Talent Acquisition Suite
    Integrating Paradox lets Workday offer a full-spectrum hiring solution—from AI-based candidate matching via HiredScore to Paradox’s conversational interface and back-office onboarding through Workday Recruiting—all within one seamless platform.
  2. Gains Ground in Frontline Hiring
    Frontline roles (think retail, hospitality, logistics) make up a huge swath of global jobs—roughly 70%. Paradox delivers exactly what that market needs: fast, high-volume, scalable hiring. Josh Bersin even called this “a highly strategic move” that could reshape Workday’s growth trajectory.
  3. Built-In AI Innovation
    Adding Paradox brings not just the tech, but the talent behind its AI tools. This deepens Workday’s AI capabilities and supports its long-term vision of building an AI agent-centric architecture.
  4. Proven ROI on Day One
    Paradox has delivered significant outcomes—Chipotle saw a 75% reduction in time-to-hire and doubled candidate flow; other clients report streamlined scheduling and improved candidate experience.

What’s in It for Workday—and how it changes the HR Tech Landscape

For Workday:

  • Expands its offering to include a strong solution for frontline and contingent workers, not just white-collar roles.
  • Enhances AI-driven hiring tools—turning a fragmented process into a unified, intelligent workflow.
  • Likely to drive stronger customer loyalty and cross-sell opportunities.
  • Will immediately add to Workday’s bottom line.

For the broader HR applications market:

  • Sets a higher bar for talent acquisition platforms—more emphasis on candidate experience, AI-driven efficiency, and conversational interfaces.
  • Adds pressure on competitors like Oracle, SAP, and ADP to step up their AI and frontline hiring solutions.

Final Thoughts

Workday’s acquisition of Paradox isn’t just about buying another tool—it’s a strategic leap into a broader, more intelligent, and more conversational hiring experience for a wider swath of the workforce. With the deal expected to close in Q3 of its fiscal year 2026 (ending October 31, 2025), Workday is positioning itself as a go-to AI-powered talent platform—built for the volume and complexity of today’s global labor markets.

Thanks for reading!

Understanding Microsoft Fabric Shortcuts

Microsoft Fabric is a central platform for data and analytics, and one of its powerful features that supports it being an all-in-one platform is Shortcuts. Shortcuts provide a simple way to unify data across multiple locations without duplicating or moving it. This is a big deal because it saves a LOT of time and effort that is usually involved in moving data around.

What Are Shortcuts?

Shortcuts are references (or “pointers”) to data that resides in another storage location. Instead of copying the data into Fabric, a shortcut lets you access and query it as if it were stored locally.

This is especially valuable in today’s data landscape, where data often spans OneLake, Azure Data Lake Storage (ADLS), Amazon S3, or other environments.

Types of Shortcuts

There are two types of shortcuts: table shortcuts and file shortcuts.

  1. Table Shortcuts
    • Point to existing tables in other Fabric workspaces or external sources.
    • Allow you to query and analyze the table without physically moving it.
  2. File Shortcuts
    • Point to files (e.g., Parquet, CSV, Delta Lake) stored in OneLake or other supported storage systems.
    • Useful for scenarios where files are your system of record, but you want to use them in Fabric experiences like Power BI, Data Engineering, or Data Science.

Benefits of Shortcuts

Shortcuts are a really useful feature; here are some of the benefits:

  • No Data Duplication: Saves storage costs and avoids data sprawl.
  • Single Source of Truth: Data stays in its original location while being usable across Fabric.
  • Speed and Efficiency: Query and analyze external data in place, without lengthy ETL processes.
  • Flexibility: Works across different storage platforms and Fabric workspaces.

How and Where Shortcuts Can Be Created

  • In OneLake: You can create shortcuts directly in OneLake to link to data from ADLS Gen2, Amazon S3, or other OneLake workspaces.
  • In Fabric Experiences: Whether working in Data Engineering, Data Science, Real-Time Analytics, or Power BI, shortcuts can be created in lakehouses or KQL (Kusto Query Language) databases, and you can use them directly as data in OneLake. Any Fabric service will be able to use them without copying data from the data source.
  • In Workspaces: Shortcuts make it possible to connect across lakehouses stored in different workspaces, breaking down silos within an organization. Shortcuts can be created from a lakehouse or KQL database.
  • Note that warehouses do not support the creation of shortcuts. However, from a warehouse you can query data stored within other warehouses and lakehouses.

How Shortcuts Can Be Used

  • Cross-Workspace Data Access: Analysts can query data in another team’s workspace without requesting a copy.
  • Data Virtualization: Data scientists can work with files stored in ADLS without having to move them into Fabric.
  • BI and Reporting: Power BI models can use shortcuts to reference external files or tables, enabling consistent reporting without duplication.
  • ETL Simplification: Instead of moving raw files into Fabric, engineers can create shortcuts and build transformations directly on the source.

Common Scenarios

  • A finance team wants to build Power BI reports on data stored by the operations team without moving the data.
  • A data scientist needs access to parquet files in Amazon S3 but prefers to analyze them within Fabric.
  • A company with multiple Fabric workspaces wants to centralize access to shared reference data (like customer or product master data) without replication.

In summary: Microsoft Fabric Shortcuts simplify data access across locations and workspaces. Whether table-based or file-based, they allow organizations to unify data without duplication, streamline analytics, and improve collaboration.

Here is a link to the Microsoft Learn OneLake documentation about Shortcuts. From there you will be able to explore all the Shortcut topics shown in the image below:

Thanks for reading! I hope you found this information useful.

AI in the Hospitality Industry: Transforming Guest Experiences and Operations

Artificial Intelligence (AI) is reshaping the hospitality industry, from guest-facing interactions to back-office optimization, revolutionizing guest experiences and operational efficiency. As hotels, resorts, and travel companies compete in an increasingly digital-first world, AI has become more than just a buzzword – despite its challenges and failures – it is a strategic necessity. AI in hospitality is expected to grow roughly 60% per year over the next decade, from about $90M in 2023 to $8B in 2033. In this article, I will share how AI is being used in hospitality and the benefits being derived or expected from those solutions. I will also touch on some of the challenges. This article is the first of a series that covers AI in various industries and business functions.

How AI Is Being Used in Hospitality

AI applications in hospitality span both guest-facing and operational functions. Examples include:

  • Chatbots and Virtual Assistants: This is one of the most highly used AI tools in hospitality. Many hotel chains use AI-powered chatbots (such as Hilton’s “Connie,” powered by IBM Watson) to handle booking requests, answer FAQs, and provide concierge services.
  • Personalized Marketing and Recommendations: Platforms like Booking.com and Airbnb use AI algorithms to recommend accommodations, activities, and promotions tailored to guests’ preferences.
  • Automated Check-ins: Hotels are rolling out solutions that allow for automated/mobile guest check-ins, sometimes with facial recognition, and digital room keys.
  • Dynamic Pricing: Revenue management systems leverage AI to adjust room rates in real time, based on demand, competition, and historical data.
  • Voice-Controlled Rooms: Smart assistants (Alexa for Hospitality, Google Nest Hub) allow guests to control lighting, temperature, and entertainment hands-free.
  • Predictive Maintenance: AI monitors hotel equipment (elevators, HVAC, kitchen appliances) to predict and prevent failures before they disrupt service.
  • Facial Recognition: Some hotels in Asia use AI-powered check-in systems that identify guests quickly and securely, reducing wait times.
  • Staff Scheduling: AI platforms are being used to optimize staffing across teams, and sometimes locations, allowing companies to do more with fewer people while improving guests’ experiences.

Tools, Technologies, and Methods Behind AI in Hospitality

The AI ecosystem in hospitality is powered by several key technologies and platforms. Here are just a few examples:

  • Machine Learning (ML) for demand forecasting, dynamic pricing, and guest behavior prediction.
  • Natural Language Processing (NLP) for chatbots, voice assistants, and multilingual guest support.
  • Computer Vision for facial recognition check-ins and enhanced security.
  • Robotics for room service delivery (e.g., robot butlers in select Marriott and Yotel properties).
  • Cloud-Based Platforms like Microsoft Azure AI, AWS AI Services, and Google Cloud AI for scalable data processing.
  • AI-Powered CRMs (e.g., Salesforce Einstein, Zoho Zia) for personalized marketing campaigns and guest engagement.

Benefits of AI in Hospitality

Companies that have adopted AI report significant improvements. Some of the known benefits include, but are not limited to:

  • Enhanced Customer Service: 24/7 chatbots provide support and answer guests’ questions instantly. Also, surveys have indicated that a high percentage of guests are comfortable with automated front desks/self-check-ins, suggesting their readiness for AI-powered guest services.
  • Enhanced Guest Experiences: AI-driven personalization leads to higher satisfaction and loyalty.
  • Operational Efficiency: Predictive analytics and automation reduce costs by optimizing staffing, inventory, and maintenance.
  • Revenue Growth and Management: Dynamic pricing algorithms increase occupancy rates and maximize revenue per available room (RevPAR).
  • Cost Management/Reduction: Through AI-assisted solutions like smart building and equipment systems, staffing optimization, automated supply chains, food management systems, and more, hospitality companies can significantly reduce costs.
  • 24/7 Availability: Chatbots and virtual assistants ensure guests receive support around the clock without adding staffing overhead.

Pitfalls and Challenges of AI in Hospitality

Despite its promise, AI adoption is not without hurdles. In addition to technology or tool challenges, there are also people challenges that impact the implementation and adoption of AI tools. Here are a few challenges, and they are not isolated to the hospitality industry.

  • Failed Implementations: Some hotels have abandoned chatbots due to poor user experiences when systems couldn’t handle complex queries.
  • Bias in AI Systems: Recommendation engines risk unintentionally favoring certain vendors or property types, creating fairness and trust issues.
  • Data Privacy Concerns: Collecting and analyzing guest data for personalization raises regulatory and ethical concerns, especially under GDPR and CCPA.
  • High Implementation Costs: Smaller operators often struggle with the initial investment required for advanced AI systems.
  • Overreliance on Automation: Excessive use of AI can diminish the “human touch” that many guests still value.

The Future of AI in Hospitality

The next phase of AI in hospitality is likely to include:

  • Hyper-Personalization: AI systems will go beyond booking preferences to tailor entire experiences—from menu suggestions to curated itineraries.
  • Generative AI: Personalized travel content (itineraries, local recommendations, even promotional materials) will increasingly be AI-generated.
  • Seamless Multimodal Interfaces: Guests will interact with hotels through integrated combinations of text, voice, and even gesture recognition.
  • Sustainability Optimization: AI will be used to minimize energy consumption and waste, appealing to environmentally conscious travelers.
  • Immersive Experiences: Integration of AI with augmented reality (AR) and virtual reality (VR) to offer “preview stays” or guided tours before booking.

How Hospitality Companies Can Gain an Advantage

To thrive in this rapidly evolving AI landscape, hospitality businesses should:

  1. Start Small, Scale Fast: Pilot AI tools (e.g., chatbots, predictive analytics) in controlled settings before rolling them out property-wide.
  2. Invest in Data Infrastructure: High-quality, integrated data systems are essential for effective AI.
  3. Balance AI with Human Service: Use AI to enhance—not replace—the human element that defines hospitality.
  4. Prioritize Ethical AI: Ensure AI systems are transparent, unbiased, and compliant with privacy regulations.
  5. Foster a Culture of Innovation: Train staff to work alongside AI tools, and encourage adoption through upskilling and change management.
  6. Partner Strategically: Collaborate with AI technology providers, startups, and academic institutions to stay ahead of the curve.

Conclusion

AI is not just a tool for the hospitality industry—it’s a catalyst for reimagining the guest journey and operational efficiency. While challenges exist, companies that harness AI responsibly and strategically stand to unlock new levels of personalization, efficiency, and growth. Those who hesitate may find themselves outpaced by competitors who use AI to transform service from reactive to predictive, and from transactional to truly memorable. And its adoption and effectiveness are expected to continue to grow, with an estimated 60% to 70% of hotels, travel agencies, and short-term rentals planning to adopt or expand their use of AI.

As mentioned earlier, this article is part of a series that shares information on AI in various industries and business functions. Be on the lookout for future articles in the series. Thanks for reading! Good luck on your data journey!

Creating a DATE value in Power BI DAX, Power Query M, and Excel

You may at times need to create a date value in Power BI, using either DAX or M, or in Excel. This quick post describes how to create a date value in Power BI DAX, in the Power Query M language, and in Excel. Working with dates is an everyday task for anyone who works with data.

In Power BI DAX, the syntax is:

DATE(<year>, <month>, <day>) //the parameters must be valid numbers

DATE(2025, 8, 23) //returns August 23, 2025

In Power Query M, the syntax is:

#date(<year>, <month>, <day>) //the parameters must be valid numbers

#date(2022, 3, 6) //returns March 6, 2022

In Excel, the syntax is:

DATE(<year>, <month>, <day>) //the parameters must be valid numbers

DATE(1989, 12, 3) //produces 12/3/1989 (officially returns a number that represents the date in Excel date-time code)
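As a side note outside these tools (my own addition for comparison), the same year/month/day constructor pattern appears in most environments, for example Python’s standard library:

```python
from datetime import date

# Same year/month/day constructor pattern as DAX DATE() and M #date().
d = date(2025, 8, 23)
print(d.isoformat())  # 2025-08-23
```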

Thanks for reading. Hope you found this useful.

Microsoft Fabric OneLake Catalog – description and links to resources

What is OneLake Catalog?

Microsoft Fabric OneLake Catalog is the next-generation, enhanced version of the OneLake Data Hub. It gives team members (data engineers, data scientists, analysts, business users, and other stakeholders) a single, intuitive, central location to browse, manage, and govern all their data. Usage is contextual: it unifies all Fabric item types (including Power BI items), integrates experiences, and provides detailed views of data subitems. It truly simplifies and transforms the way we can manage, explore, and use content in Fabric. It is a great tool.

Why use OneLake Catalog?

This tool will make your work within Fabric easier, reduce duplication of items through improved discoverability, and enhance your ability to govern data objects within the platform. So, check out the resources below to learn more.

Here is a link to a detailed Microsoft blog post introducing the OneLake Catalog:

And here is a link to a Microsoft Learn OneLake Catalog overview:

And finally, this is a link to a great, short (less than 5 min) video that gives an overview of the OneLake Catalog:

Thanks for reading! Good luck on your data journey!

Using MAXX in Power BI to find the Latest Event Date across several event date columns in each row

We were working with some HR data which included multiple date fields such as Hire Date, Transfer Date, Promotion Date, and Termination Date. We needed to determine the most recent event date before termination. We ended up using the MAXX function to do this.

Sample dataset to demonstrate the scenario:

Employee | Hire Date | Transfer Date | Promotion Date | Termination Date
Alice | 2020-01-15 | 2021-05-10 | 2022-03-20 | 2023-06-15
Bob | 2019-11-01 | 2020-07-15 | 2021-10-05 | (blank)
Carol | 2021-03-25 | (blank) | 2021-09-14 | 2022-02-28

The goal is to calculate the most recent event and event date (i.e., the latest event and its date) between Hire Date, Transfer Date, and Promotion Date for each row. Termination Date was excluded from the comparison because the goal was to find the latest event before Termination (if that had occurred).

Using MAXX for Row-by-Row Evaluation

MAXX is an iterator function in DAX, meaning it evaluates an expression for each row of a table, then returns the maximum value. Iterator functions such as MAXX and SUMX work row-by-row over a table, in contrast to aggregate functions like MAX and SUM which operate over an entire column at once.

  • Aggregate example (MAX): Finds the highest value in a column across all rows.
  • Iterator example (MAXX): Evaluates an expression for each row in a virtual table, then finds the highest value.

This makes MAXX ideal for scenarios like this where the various dates are in multiple columns of the same row, and we need to find the max of these dates on each row.

DAX Code Example: This is an example of the code that was used to derive the latest event date.

Latest Event Date =
MAXX(
    { [Hire Date], [Transfer Date], [Promotion Date] },
    [Value]
)

Code Explanation:

  1. Create a virtual table with one column and three rows—one for each date we want to consider.
  2. MAXX iterates through this virtual table, evaluates [Value] (the date) for each of its rows, and returns the maximum (latest) date, which becomes the Latest Event Date for that row of the source table.

Expected Output based on the sample dataset:

Employee | Hire Date | Transfer Date | Promotion Date | Termination Date | Latest Event Date
Alice | 2020-01-15 | 2021-05-10 | 2022-03-20 | 2023-06-15 | 2022-03-20
Bob | 2019-11-01 | 2020-07-15 | 2021-10-05 | (blank) | 2021-10-05
Carol | 2021-03-25 | (blank) | 2021-09-14 | 2022-02-28 | 2021-09-14

This is much cleaner than using nested IF checks to determine the latest date / latest event for each record. Of course, the MAXX function can be used in other scenarios where you want to find the max value across multiple columns on each row.
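Outside DAX, the same row-wise maximum can be sketched in Python with pandas (my own cross-check illustration, not part of the original solution; blanks are represented as NaT and skipped, matching how MAXX ignores blank dates):

```python
import pandas as pd

# Sample rows mirroring the article's dataset (None becomes NaT, i.e., blank).
df = pd.DataFrame({
    "Hire Date": pd.to_datetime(["2020-01-15", "2019-11-01", "2021-03-25"]),
    "Transfer Date": pd.to_datetime(["2021-05-10", "2020-07-15", None]),
    "Promotion Date": pd.to_datetime(["2022-03-20", "2021-10-05", "2021-09-14"]),
})

# max(axis=1) evaluates each row, like MAXX over a per-row virtual table.
df["Latest Event Date"] = df[["Hire Date", "Transfer Date", "Promotion Date"]].max(axis=1)
print(df["Latest Event Date"].dt.strftime("%Y-%m-%d").tolist())
# ['2022-03-20', '2021-10-05', '2021-09-14']
```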

Thanks for reading and I hope you found this useful!