SyncWords isn’t just about adding subtitles and voice translations to live streaming; it’s about disruption.
Today, I’m talking with Ash Shah, CEO of SyncWords, a platform that powers live video streams with accurate subtitles and dubs using AI. Traditional broadcasters are adopting FAST (free ad-supported streaming TV) and streaming to survive, reaching a broader audience and adapting to consumer preferences for content. This transition forces them to re-engineer their infrastructure so they can launch new delivery services with agility.
While streamers and content creators have found affordable, convenient ways to add subtitles and dubs to video-on-demand (VOD) content, doing the same for live streaming has been a major challenge: live streaming workflows are complex, and traditional methods built around hardware encoders are prohibitively expensive.
Enter SyncWords and recent advances in AI, and you have patented, cutting-edge technology that generates accurate captions, subtitles and voice dubs in real time and injects them into live streams with a latency of just a few seconds. And because modern streaming protocols such as HLS (HTTP Live Streaming) carry subtitles as separate renditions, viewers can be offered any supported language alongside the video. This is a game-changer for live streaming, including live sports and news as well as live corporate and government events, which can be streamed to global audiences in the languages of their choice.
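To make that concrete, here is a minimal sketch of how an HLS master playlist can advertise subtitle tracks in multiple languages. The file names, languages and bitrate below are illustrative only, not SyncWords’ actual output.

```python
# Minimal sketch: an HLS master playlist that advertises subtitle
# renditions in several languages. File names and languages are made up
# for illustration; a real caption-insertion service would also keep the
# subtitle segment playlists in sync with the live video.
SUBTITLE_TRACKS = [
    ("en", "English"),
    ("es", "Español"),
    ("ja", "日本語"),
]

def build_master_playlist(video_uri: str) -> str:
    lines = ["#EXTM3U"]
    for lang, name in SUBTITLE_TRACKS:
        lines.append(
            f'#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",'
            f'NAME="{name}",LANGUAGE="{lang}",AUTOSELECT=YES,'
            f'URI="subs_{lang}.m3u8"'
        )
    # The video variant references the subtitle group, so players can
    # offer a language menu without any change to the video itself.
    lines.append(
        '#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080,'
        'SUBTITLES="subs"'
    )
    lines.append(video_uri)
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    print(build_master_playlist("video_1080p.m3u8"))
```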
Having raised $2 million in seed funding, Ash aims to make SyncWords’ technology ubiquitous, so that a major portion of live streaming carries captions and live translations, democratizing live video content for worldwide audiences.
SyncWords’ All-in-One Platform for Enhanced Accessibility and Inclusivity
Q: Can you describe the market opportunity SyncWords is pursuing? What are the pain points you are solving for your customers?
A: There are over 200 billion hours of live streaming taking place every year, and only a tiny percentage of it has live subtitles or voice translations – dubs. We are talking about live streaming for sports, news, gaming and events. Over 10% of the world’s population is deaf or hard-of-hearing, and over 80% of the global population does not understand English well, so we have a problem. Furthermore, the United States, Canada, Australia, the European Union and a growing number of countries are mandating video accessibility. This has been an unsolved problem since the start of video streaming. We estimate that translating videos can grow the audience size by 5X. Therefore, video streaming companies are taking video localization very seriously.
Live content needs to be captioned and translated in real time, with subtitles and dubs inserted into the live video. Unlike VOD content, which is file-based, live streaming presents far greater workflow challenges, making the problem hard to solve. Current solutions for live captioning include hardware encoders and human captioners, both of which are expensive and challenging to manage. No company was able to crack the code until SyncWords solved the problem with its innovative platform.
Q: Your company has pivoted a couple of times before. What is special about this pivot?
A: In the AI era, companies have to pivot to survive, and if you pivot quickly enough, it can lead to a huge technical advantage for your company. The danger lies in using AI to become another “me too” product. And this can happen if a company jumps into a competitive market where many similar products are already available.
When we formed SyncWords, our product was captions and subtitles for video-on-demand, and although we had innovative technology to automate the process, it was a competitive market. We also developed a product for transcribing live events in real time, delivered as widget-based captions and translations – no live dubs at that time, just text. During COVID, everything seemed to sell, even these widget captions, and we acquired some great customer logos during that period.
Of course, 99% of our customers preferred viewing subtitles and audio dubs inside a video player, not outside it in a widget. My fellow co-founders, Alex Dubinsky and Sam Cartsos, also realized this, and in 2022 we pivoted to build a platform for in-player live subtitles and dubs for video streaming. We launched the platform in December 2023 and have since signed numerous new customers. Remarkably, we bootstrapped most of the product development using revenues from our legacy product. To date, we have secured $2 million in seed funding, and we are looking to raise a new round after we achieve certain milestones, which we are confident of meeting given the demand and the new business from the new platform.
SyncWords Founding Team: Ashish Shah, Sam Cartsos and Alex Dubinsky, Near the Company’s Headquarters in New York, USA
Q: What were the key contributors to successfully solving a challenging technology problem?
A: Clearly, it’s a big lift to develop software to solve a challenging problem, especially one that other companies have attempted unsuccessfully. It was very clear to us that the industry was in need of such a solution. Let me tell you what makes the problem so challenging. First, you are dealing with live streaming protocols, which are very unlike the file-based formats of video-on-demand and much more challenging to work with. There is no open-source code available for live workflows, so we had to build everything in-house at SyncWords.
The key aspect of our approach was our architecture. Adopting a scalable, cloud-based Kubernetes architecture was essential. We were also maniacal about user experience and about the accuracy of our subtitle and dub outputs, which are generated by AI and reach 94% accuracy on our core high-resource language pairs. The accuracy of AI translations continues to rise, and we expect more language pairs to enter the high-accuracy zone in 2024. We also did not reinvent the wheel on AI modalities; we used best-of-breed speech recognition and translation AI providers to deliver a highly accurate solution for our customers.
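For readers who want a feel for how such a pipeline fits together, here is a minimal sketch, assuming chunked live audio and pluggable third-party providers. This is not SyncWords’ code; the provider functions are placeholders for whichever speech-recognition and translation services are wired in.

```python
# Sketch of a chunked live captioning/translation pipeline. The transcribe
# and translate callables stand in for third-party AI providers; nothing
# here is a real vendor SDK.
from dataclasses import dataclass
from typing import Callable, Iterable, Iterator, List, Tuple

@dataclass
class SubtitleCue:
    start: float        # seconds from the start of the stream
    end: float
    language: str
    text: str

def caption_live_audio(
    audio_chunks: Iterable[Tuple[float, float, bytes]],
    transcribe: Callable[[bytes], str],      # placeholder ASR provider
    translate: Callable[[str, str], str],    # placeholder MT provider
    target_languages: List[str],
) -> Iterator[SubtitleCue]:
    """Yield subtitle cues as each short audio chunk arrives.

    Working a few seconds of audio at a time is what keeps end-to-end
    latency low: each chunk is transcribed once, then translated into
    every requested language before the next chunk is processed.
    """
    for start, end, audio in audio_chunks:
        source_text = transcribe(audio)
        yield SubtitleCue(start, end, "en", source_text)
        for lang in target_languages:
            yield SubtitleCue(start, end, lang, translate(source_text, lang))
```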
And finally, the SyncWords team comprises people with diverse technology backgrounds – live streaming, video, AI, captioning and voice – which was key in cracking the problem. Alex’s contributions have enabled us to create a unique offering focused on low-latency subtitles and dubs and on scalability using Kubernetes. The majority of the team are engineers, and we are adding more people in sales and marketing.
Q: What are the unique highlights of your technology?
A: SyncWords isn’t just keeping up; we’re leading the way in live captioning and translations for streams and events.
Future-Ready Streaming Protocols: While RTMP(S) has been a staple, the future is all about HLS and SRT protocols. SyncWords supports them all, making your live streaming more agile and scalable. SRT (Secure Reliable Transport) is a newer streaming protocol which is fast gaining popularity among broadcasters and OTT platforms.
Multi-Language Magic and Voice Variety: Imagine captioning in over 50 languages and translating into 100+ languages within minutes, even during concurrent live streams. Plus, you can choose from 900 different voices, including regional accents, for automated live dubbing that’s truly inclusive. SyncWords crushes language barriers, making global communication seamless and instant.
Seamless Integration: SyncWords fits right into your current live streaming workflow without needing any software changes. The solution is also fully compatible with AWS Elemental MediaPackage, MediaLive, and MediaConnect, and we’re the only ones on the market to offer that.
Top-Tier Tech Specs: We’ve nailed the essentials for live streaming: support for UHD and HEVC codecs, low latency, PIDs, audio channels, and API capabilities (a rough sketch of what an API-driven workflow could look like follows below). Our cloud-based solution ensures you are all set for 24/7 live programming that’s inclusive and operationally seamless.
High-End Live AI Captioning, Subtitling and Dubbing for HLS, RTMP(S) and SRT Live Streams
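Purely as an illustration of an API-driven workflow – every endpoint, field and value below is hypothetical, since SyncWords’ actual API is not documented here – a session that ingests an SRT feed and returns an HLS output with extra subtitle and dub renditions might be started like this:

```python
# Hypothetical illustration only: the endpoint, request fields and auth
# scheme are invented for this sketch and do not describe SyncWords' API.
import requests  # pip install requests

API_BASE = "https://api.example.com/v1"   # placeholder, not a real endpoint
API_KEY = "YOUR_API_KEY"

session_request = {
    # Where the service should pull the live feed from (SRT in this example).
    "ingest_url": "srt://encoder.example.com:9000",
    "source_language": "en",
    # Languages to generate live subtitles and AI voice dubs for.
    "subtitle_languages": ["es", "fr", "ja"],
    "dub_languages": ["es"],
    "output_protocol": "hls",
}

response = requests.post(
    f"{API_BASE}/live-sessions",
    json=session_request,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
response.raise_for_status()
# The returned playback URL would point at an HLS stream that already
# carries the additional subtitle and audio renditions.
print(response.json())
```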
Q: Will the demand for live captions and translations keep growing?
A: Absolutely, and here’s why. First, the amount of live video content is skyrocketing. More live streams mean more need for captions and translations. Second, new accessibility and inclusivity rules are pushing live streaming providers to step up their game. They need to meet these standards or face hefty fines. And lastly, it’s what viewers want. People crave a more engaging and inclusive viewing experience.
Plus, the influence of platforms like Netflix, Amazon Prime, and Hulu can’t be ignored. They’ve set the bar high with their wide range of language options for captions, subtitles, and audio dubbing. Now, viewers expect the same flexibility for live programming. So, yes, the demand is definitely going to keep growing, and SyncWords is going to be a huge beneficiary of this rapid growth.