Reluvate
AI Content Generation for a Major FinTech Industry Conference

Financial Technology & Media

Singapore · 4 months

Developed an AI content generation system that transformed roundtable discussion transcripts from a major FinTech industry conference into structured, publication-ready reports. The system processed hours of expert roundtable recordings, extracted key themes and insights, and generated polished reports that were published as official conference outputs.

Days · Report turnaround (was weeks)

Dozens · Roundtable sessions processed

Publication-ready · AI-generated report quality

Challenge

Major industry conferences like FinTech Festival convene roundtable discussions where senior executives, regulators, and thought leaders debate critical topics — the future of digital banking, regulatory approaches to crypto assets, AI governance in financial services, cross-border payment infrastructure, and similar themes. These discussions generate enormous intellectual value, but capturing and distributing that value is a persistent challenge.

The traditional approach was to have note-takers at each roundtable, then assign writers to produce summary reports from the notes. This process was expensive (professional writers are not cheap), slow (reports appeared weeks after the event, when interest had waned), and lossy (note-takers missed nuances, writers interpreted notes subjectively). The conference organisers wanted to produce high-quality reports from dozens of roundtable sessions within days of the event, which was impossible with a purely manual workflow.

The quality bar was high. These reports would be published under the conference brand and distributed to an audience of senior financial services professionals. They needed to be accurate (no misattribution of views), balanced (fairly representing different perspectives expressed in the discussion), insightful (adding analytical value beyond just summarising what was said), and professionally written (suitable for C-suite readership). Generic AI summarisation tools produced outputs that were accurate but bland — lacking the analytical depth and professional tone required.

Approach

Reluvate built a multi-stage content generation pipeline. The first stage processed audio recordings of roundtable discussions through speech-to-text transcription, with speaker diarisation to attribute statements to individual participants. The transcription was then cleaned and segmented by topic, with the AI identifying natural discussion transitions and thematic clusters within each roundtable.

The analytical engine — the core of the system — went beyond summarisation to produce genuine analysis. For each thematic segment, the system identified the key arguments presented, the points of consensus and disagreement among participants, the implications of the discussion for the broader industry, and connections to themes from other roundtable sessions at the same conference. This cross-session synthesis was particularly valuable: it revealed patterns and convergences across discussions that no individual attendee could have observed.

The report generation stage produced structured, publication-ready documents following the conference's editorial guidelines. Each report included an executive summary, thematic analysis sections, key quotes from participants (verified against the transcript for accuracy), and forward-looking implications. The tone was calibrated to match the conference's existing publication style — authoritative, analytical, and accessible to a senior professional audience. Draft reports were reviewed by the conference editorial team and required minimal editing before publication.
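The topic-segmentation step described above can be sketched as follows. This is a minimal illustration, not Reluvate's actual implementation: the cue phrases, the `(speaker, text)` turn format, and the boundary heuristic are all assumptions; a production system would use a learned topic-boundary model rather than keyword matching.

```python
# Illustrative transition cues; the real system's boundary detection
# is not specified in the case study.
BOUNDARY_CUES = ("moving on", "let's turn to", "next topic")

def segment_by_topic(turns, cues=BOUNDARY_CUES):
    """Split diarised (speaker, text) turns into thematic segments.

    A new segment starts whenever a turn contains a transition cue,
    approximating the 'natural discussion transitions' described above.
    """
    segments, current = [], []
    for speaker, text in turns:
        if current and any(cue in text.lower() for cue in cues):
            segments.append(current)
            current = []
        current.append((speaker, text))
    if current:
        segments.append(current)
    return segments

# Hypothetical diarised turns from a roundtable.
turns = [
    ("Chair", "Welcome, let's begin with digital banking."),
    ("Speaker A", "Digital banking will consolidate around platforms."),
    ("Chair", "Moving on to cross-border payments."),
    ("Speaker B", "Interoperability is the key constraint."),
]
segments = segment_by_topic(turns)  # two thematic segments of two turns each
```

Each resulting segment then feeds the analytical engine independently, which is what makes per-theme extraction of arguments and positions tractable.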

Design Notes

The most critical design decision was separating the analytical layer from the generation layer. Many AI content tools generate text directly from source material, but this produces superficial summaries that don't add analytical value. Reluvate's pipeline first extracts structured analytical elements — arguments, positions, evidence, implications — and then generates text from this structured analysis. This two-stage approach produces reports that are genuinely insightful rather than merely summarising what was said.

Change management was focused on the conference editorial team, who were initially sceptical that AI could produce content meeting their standards. Reluvate ran a blind evaluation: the editorial team reviewed reports from three roundtable sessions without knowing which were AI-generated and which were human-written. The AI-generated reports were rated comparable in quality and superior in turnaround time. This proof point was essential for editorial buy-in.

Exception handling accounts for the unique challenges of processing discussion content. Roundtable discussions often include off-the-record comments, preliminary views that speakers explicitly qualify as personal rather than institutional positions, and sensitive commercial information shared in confidence. The system flags these segments based on linguistic cues ("off the record," "speaking personally," "this is confidential") and excludes them from published reports. Additionally, all participant quotes included in reports are verified against the original transcript to prevent misattribution — a critical requirement given the seniority of the participants.
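The cue-based flagging and quote verification described above can be sketched like this. The cue list mirrors the phrases quoted in the case study; the fuzzy-matching approach and the 0.9 similarity threshold are illustrative assumptions rather than the system's documented behaviour.

```python
import difflib

# Cue phrases named in the case study for content excluded from publication.
SENSITIVE_CUES = ("off the record", "speaking personally", "this is confidential")

def flag_sensitive(segment_text):
    """True if a segment should be withheld from the published report."""
    lowered = segment_text.lower()
    return any(cue in lowered for cue in SENSITIVE_CUES)

def verify_quote(quote, transcript, threshold=0.9):
    """Fuzzy-check that a draft quote appears near-verbatim in the transcript.

    Slides a window the length of the quote across the transcript and keeps
    the best similarity ratio; the threshold value is an assumed parameter.
    """
    q_words = quote.lower().split()
    t_words = transcript.lower().split()
    n = len(q_words)
    best = 0.0
    for i in range(max(1, len(t_words) - n + 1)):
        window = " ".join(t_words[i:i + n])
        best = max(best, difflib.SequenceMatcher(None, " ".join(q_words), window).ratio())
    return best >= threshold
```

In practice a flagged segment would be routed to the editorial team for a judgment call rather than silently dropped, and any quote failing verification would block the draft until corrected.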

Result

The conference produced publication-ready roundtable reports within days of the event, compared to the weeks required by the previous manual process. Report quality was validated by the editorial team as meeting their publication standards. Cross-session thematic analysis provided insights that weren't available from individual roundtable reports, adding a new dimension to the conference's intellectual output. The system has been retained for use at subsequent conference events.

content-generation · NLP · FinTech · conference · transcription · analytics