With a recent Pew Research report showing that 50 percent of Americans are more concerned than excited about AI, and only 10 percent more excited than concerned, employers who want employees to learn AI skills have a tough slog ahead. Here’s an idea: if you want employees to adopt generative AI, put them in a room, give them real data, and require working demos by the end of the sprint.
The strongest case for this format comes from how learning and adoption actually happen. A widely cited active learning meta-analysis found higher performance and fewer failures when people solve problems directly rather than listen to lectures. A rigorous project-based learning review ties artifact creation to gains in achievement. A 2025 working paper on generative AI in customer support measured a 14 to 15 percent productivity lift, with larger gains for novices, evidence that guided tools and exemplars accelerate capability on the job.
Organizational change also follows visibility and peer proof, the core of the literature synthesized in a classic diffusion analysis. Put people together, give them governed data, and require five-minute demos to executive judges. That is the fastest way to move from interest to working software and from working software to credible pilots.
Time pressure and coaching focus attention. A recent systematic review of hackathons maps how short, intense builds drive teamwork, problem solving and persistence when organizers set clear goals and provide structure. A complementary educational evaluation reaches similar conclusions and highlights the value of facilitators who unblock teams during the sprint. The result is not only faster skill acquisition but a clearer path to standardization, governance and scale.
The American Society for Nondestructive Testing turned the research into a program that shipped results in public. Its 2025 conference positioned an AI Agent Battle as a marquee experience on the official agenda, complete with a two-day, build-and-compete format tied to practical workflows. The organization primed the field before the showdown through a public webinar that introduced agent patterns, build steps and governance expectations, which lowered activation energy for first-time builders.
The structure mattered. Attendees did not sit for long lectures; they built agents tied to real inspection tasks, iterated in public and showed results on a deadline. That format aligns with strong evidence that active learning outperforms lecture-first instruction, including a well-cited meta-analysis that found higher performance and lower failure rates when learners engage directly with problems.
Reviews of project-based learning show similar gains, as documented in a recent higher-education review and a science-education meta-analysis. Research on hackathon-style builds also points to improved teamwork, problem solving and persistence when the event is time bounded and well coached, as summarized in a 2024 systematic review and a complementary educational evaluation.
So how should business and government leaders adapt this model?
Start by promising what matters to executives: working demos on a clock that address real workflows. Publish an internal schedule complete with sprint start, demo window and judging criteria. Staff expert facilitators to roam as unblockers rather than lecturers. Offer a short pre-brief a week before the build, as the American Society for Nondestructive Testing did, where you introduce agent patterns your business needs and review data guardrails. Provide a sandbox that mirrors production constraints. Preload governed, redacted or synthetic datasets so that teams can build safely without waiting on approvals.
Treat the workshop like a product launch, not a class. Give it a name, publish rules and state deliverables up front. Require three artifacts from every team by the final bell: a short problem statement, a must-have capability checklist, and a data access plan that names sources and permissions. Record every demo and publish them on an internal portal. Tag entries by workflow and data domain, and include a lightweight request form for productionization. Commit to a two-week decision window for the strongest prototypes to move into controlled pilots. As teams progress, connect their outcomes to the enterprise business case to report throughput gains.
Finally, make sure to close the loop before momentum fades. Ask each team to submit a one-page risk register that captures data dependencies, security exposures and monitoring needs. Stand up a lightweight review that approves top prototypes for pilots. Begin the next quarter’s build with quick updates from prior winners showing movement on cycle time, defect rates or satisfaction. Over time, you build a library of approved, reusable agents and a standing competition that sources the next candidates.
The effect is cumulative. The diffusion analysis predicts faster uptake when exemplars are visible, and coached, time-boxed practice makes those exemplars stick.
A well-run build workshop is not theater. It is an evidence-backed way to translate generative AI from headlines into operating leverage. The research behind active learning, project-based practice, and hackathon design favors coached, time-bounded builds that culminate in visible demos. Leaders who adopt this model will leave not with slide decks but with demos, data plans and pilots they can fund immediately. That is how education becomes deployment, and deployment becomes sustained competitiveness.
Gleb Tsipursky, Ph.D., serves as the CEO of the hybrid work consultancy Disaster Avoidance Experts and authored the best-seller “Returning to the Office and Leading Hybrid and Remote Teams.”