
A high-performance stories experience for a social media ecosystem

Client
A major player in social media and entertainment
Industry
Entertainment, social media, live streaming
Services
Custom software development, staff augmentation, mobile development
Tech stack
Node.js, DynamoDB, AWS S3, AWS CloudFront, AWS Elemental MediaConvert, Redis, RabbitMQ, WebSocket, Jetpack Media3, ExoPlayer, OpenGL, Canvas API, WorkManager, Kotlin, Swift, Objective-C, SwiftUI, Combine, CocoaPods, Swift Package Manager

Challenge

The client, a dominant entertainment and social media company operating multiple high-traffic dating, live-streaming, and social interaction apps with millions of users worldwide, wanted to evolve its video streaming service to attract top creators, boost engagement, and strengthen monetization. The company turned to ITRex to augment its team with experienced developers and engineers. The objective was to introduce a Stories feature, similar to Instagram’s or Snapchat’s, but tailored to the client’s ecosystem of social and streaming apps. This wasn’t a cosmetic improvement. For the client, Stories needed to become a strategic growth engine: a tool for streamers to announce upcoming live sessions, share exclusive updates, and engage fans without going live.

Solution

ITRex delivered a full-scale Stories capability engineered from the ground up for performance, flexibility, and creator-driven engagement. Because nothing off-the-shelf met the client’s requirements, our team built both the back-end and mobile experiences using deeply customized logic. Below, we explain the functionality from the perspective of story creators and viewers.

Creator experience: a powerful story creation toolkit

We built a rich authoring environment that lets creators produce dynamic, expressive stories. Creators can use the following capabilities:
Robust media editing with multiple text layers, colors, scaling, tagging, repositioning, and resizing
Smooth video clipping without overloading device memory
Reliable upload workflows that can handle interruptions, lost connections, or background mode
Simultaneous uploads governed by custom rules and resource-safe scheduling, so creators can post multiple stories at once (see the sketch below)
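The upload pipeline has to survive process death, lost connectivity, and backgrounding. As a rough illustration of how such uploads can be scheduled on Android, the following Kotlin sketch uses WorkManager, which is part of the project’s stack; the worker class, key names, and upload call are our own assumptions, not the client’s actual code.

```kotlin
import android.content.Context
import androidx.work.*
import java.util.concurrent.TimeUnit

// Hypothetical worker: uploads one story and lets WorkManager retry on failure.
class StoryUploadWorker(context: Context, params: WorkerParameters) :
    CoroutineWorker(context, params) {

    override suspend fun doWork(): Result {
        val storyId = inputData.getString("storyId") ?: return Result.failure()
        return try {
            uploadStory(storyId)           // stands in for the real chunked-upload client
            Result.success()
        } catch (e: Exception) {
            Result.retry()                 // retry with backoff after dropped connections or backgrounding
        }
    }

    private suspend fun uploadStory(storyId: String) { /* network call omitted */ }
}

// Enqueue each story as unique work so simultaneous uploads don't collide;
// constraints keep uploads resource-safe (network required, battery not low).
fun enqueueStoryUpload(context: Context, storyId: String) {
    val request = OneTimeWorkRequestBuilder<StoryUploadWorker>()
        .setInputData(workDataOf("storyId" to storyId))
        .setConstraints(
            Constraints.Builder()
                .setRequiredNetworkType(NetworkType.CONNECTED)
                .setRequiresBatteryNotLow(true)
                .build()
        )
        .setBackoffCriteria(BackoffPolicy.EXPONENTIAL, 30, TimeUnit.SECONDS)
        .build()

    WorkManager.getInstance(context)
        .enqueueUniqueWork("upload-$storyId", ExistingWorkPolicy.KEEP, request)
}
```

Enqueueing each story as unique, constrained work lets the OS retry interrupted uploads and prevents several simultaneous uploads from exhausting network or battery resources.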
Viewer experience: responsive, fluid, and built for interaction

The viewer side required a sophisticated gesture and playback engine that could handle any combination of user movements (a simplified sketch follows the list below), including:
Swipe to next/previous story
Rewind and fast-forward without releasing the finger
Interrupt and resume
Layered navigation between different story creators
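Each of these gestures can arrive mid-playback and overlap with others. The Kotlin sketch below shows one way gesture events could drive an explicit playback state machine; the event and state names are illustrative assumptions, not the client’s implementation.

```kotlin
// Illustrative gesture events and playback states for a stories viewer.
sealed interface StoryGesture {
    data object TapNext : StoryGesture
    data object TapPrevious : StoryGesture
    data object HoldStart : StoryGesture                     // finger down: pause
    data object HoldEnd : StoryGesture                       // finger up: resume
    data class Scrub(val progress: Float) : StoryGesture     // rewind/fast-forward while holding
    data class SwipeToCreator(val creatorId: String) : StoryGesture
}

sealed interface PlaybackState {
    data class Playing(val storyIndex: Int) : PlaybackState
    data class Paused(val storyIndex: Int) : PlaybackState
    data class Scrubbing(val storyIndex: Int, val progress: Float) : PlaybackState
}

// Pure reducer: every gesture maps the current state to exactly one next state,
// so rapid or overlapping gestures never leave playback in an undefined state.
fun reduce(state: PlaybackState, gesture: StoryGesture, storyCount: Int): PlaybackState =
    when (gesture) {
        StoryGesture.TapNext -> PlaybackState.Playing((state.index() + 1).coerceAtMost(storyCount - 1))
        StoryGesture.TapPrevious -> PlaybackState.Playing((state.index() - 1).coerceAtLeast(0))
        StoryGesture.HoldStart -> PlaybackState.Paused(state.index())
        StoryGesture.HoldEnd -> PlaybackState.Playing(state.index())
        is StoryGesture.Scrub -> PlaybackState.Scrubbing(state.index(), gesture.progress)
        is StoryGesture.SwipeToCreator -> PlaybackState.Playing(storyIndex = 0) // first story of the next creator
    }

private fun PlaybackState.index(): Int = when (this) {
    is PlaybackState.Playing -> storyIndex
    is PlaybackState.Paused -> storyIndex
    is PlaybackState.Scrubbing -> storyIndex
}
```

Because the reducer is a pure function, every gesture combination, such as scrubbing while switching creators, can be unit-tested in isolation.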
The ITRex team designed a state management system that interprets multiple gestures simultaneously and keeps playback stable even under rapid user interactions.

Reactions: a flexible system for engagement and creator feedback

As part of the Stories feature, the client wanted viewers to react to stories using customizable emojis and story creators to see their audience’s responses. To meet this requirement, our team designed an extensible architecture that:
Allows viewers to add reactions that will be visible to creators immediately
Includes validation logic to ensure users can’t react to expired stories or those they already reacted to
Preserves and displays a full reaction history
Automatically cleans reactions when a story expires or is deleted
Can be extended with different reaction types (symbols and emojis) if needed in the future (see the sketch below)
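The reaction service itself is a Node.js microservice; the Kotlin sketch below only illustrates the kind of validation and cleanup rules listed above, and every type and name in it is our own assumption.

```kotlin
import java.time.Instant

// Illustrative reaction model; the real service is a Node.js microservice.
data class Reaction(val storyId: String, val viewerId: String, val type: String, val createdAt: Instant)

data class Story(val id: String, val creatorId: String, val expiresAt: Instant, val deleted: Boolean)

sealed interface ReactionResult {
    data class Accepted(val reaction: Reaction) : ReactionResult
    data class Rejected(val reason: String) : ReactionResult
}

class ReactionService(
    private val allowedTypes: Set<String>,                                      // configurable reaction types
    private val existing: MutableMap<Pair<String, String>, Reaction> = mutableMapOf()
) {
    fun react(story: Story, viewerId: String, type: String, now: Instant = Instant.now()): ReactionResult {
        if (story.deleted || now.isAfter(story.expiresAt))
            return ReactionResult.Rejected("story expired or deleted")
        if (type !in allowedTypes)
            return ReactionResult.Rejected("unsupported reaction type")
        if (existing.containsKey(story.id to viewerId))
            return ReactionResult.Rejected("already reacted to this story")

        val reaction = Reaction(story.id, viewerId, type, now)
        existing[story.id to viewerId] = reaction   // persisted and pushed to the creator in the real system
        return ReactionResult.Accepted(reaction)
    }

    // Called when a story expires or is deleted, mirroring the automatic cleanup rule.
    fun purgeStory(storyId: String) {
        existing.keys.removeAll { it.first == storyId }
    }
}
```

Keeping the allowed reaction types as configuration rather than code is what makes adding new symbols or emojis a non-breaking change.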

Tech

To deliver the functionality described above, our team deployed the following technologies.

Back end

The client’s platform relies on a microservices architecture, so we built the entire Stories capability around AWS-native components.

Coordinating the many microservices with each other and with the user’s device was a challenge: stories expire after 24 hours, creators delete individual stories, and users remove their entire accounts. Each of these events must immediately reflect on the viewer’s device so users never see expired or removed content in their story marquee.

Another difficulty was ensuring that users could resume a story exactly where they left off. The Stories feature had to survive interruptions, such as phone calls, backgrounding, and low battery, while also accounting for constantly changing story availability.

Yet another challenge was implementing reactions. The reaction integration service had to be as abstract and general as possible to support configuring new reaction types in the future. Our team also introduced a notification center to maintain reaction history; it was designed to support more diverse notification types down the line.

To maintain speed and responsiveness, we created a dual-layer storage and synchronization approach that provides real-time updates while maintaining reliable long-term records (a simplified sketch appears after the technology lists below).

Architecture

We used the following AWS services:
AWS Elemental MediaConvert to process uploaded videos and convert them into Apple HLS format for faster and smoother streaming
AWS S3 and CloudFront to store and deliver stories with low latency
AWS DynamoDB as the primary database for story metadata, reactions, user states, and more
Other tech includes:
Redis for short-term storage to speed up response time and reduce DynamoDB load
Node.js for microservices to orchestrate all interactions
HTTP and AMQP for efficient inter-service communication
WebSocket protocol to handle communication between client and server, such as delivering real-time updates
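The dual-layer approach can be pictured as a read-through/write-through cache: Redis answers hot reads with a TTL aligned to the 24-hour story lifetime, while DynamoDB stays the durable source of truth. The Kotlin sketch below is a simplified illustration under those assumptions; the production services run on Node.js, and every name here is ours.

```kotlin
import java.time.Duration

// Simplified stand-ins for the two storage layers; the real system uses Redis and DynamoDB.
interface ShortTermStore {                      // Redis-like: fast, TTL-based
    fun get(key: String): String?
    fun put(key: String, value: String, ttl: Duration)
    fun delete(key: String)
}

interface LongTermStore {                       // DynamoDB-like: durable source of truth
    fun load(key: String): String?
    fun save(key: String, value: String)
    fun remove(key: String)
}

class StoryMetadataRepository(
    private val cache: ShortTermStore,
    private val database: LongTermStore,
    private val storyTtl: Duration = Duration.ofHours(24)   // matches the story lifetime
) {
    // Read-through: serve from the cache when possible, fall back to the database and repopulate.
    fun getStory(storyId: String): String? =
        cache.get(storyId) ?: database.load(storyId)?.also { cache.put(storyId, it, storyTtl) }

    // Write-through: persist first, then refresh the cache so viewers see updates immediately.
    fun saveStory(storyId: String, metadataJson: String) {
        database.save(storyId, metadataJson)
        cache.put(storyId, metadataJson, storyTtl)
    }

    // Expiry or deletion must evict both layers so no device ever fetches stale content.
    fun deleteStory(storyId: String) {
        database.remove(storyId)
        cache.delete(storyId)
        // In the real system a WebSocket event would also notify connected viewers here.
    }
}
```

Evicting both layers on deletion or expiry is what guarantees that a story removed by its creator never resurfaces in a viewer’s marquee.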
Mobile client

The front end pushed far beyond standard mobile development. To offer a rich media and video-processing experience, our engineers implemented custom algorithms, precise mathematical manipulations, and careful optimization to keep performance smooth on devices with limited memory and power.

The mobile environment added more complexity, as any app can be interrupted at any moment by calls, notifications, or backgrounding. Yet story creation, processing, and upload still had to remain reliable under all conditions.

The interface itself introduced another challenge. Users interact with stories through a wide range of gestures, and the app needed to interpret every one of them without lag. We had to prepare for a multitude of edge cases: rewinding while a finger remains on the screen, jumping between stories, switching creators mid-gesture, and rapidly combining multiple actions.

Architecture
OpenGL and Jetpack Media3 with custom matrix transformations on the phone’s graphics adapter to streamline content creation and video rendering
WorkManager framework to reliably render and upload media data in the background given the restrictions of mobile devices
Combine framework for handling complex, asynchronous events
ExoPlayer to ensure stable, high-quality playback across a wide range of devices and network conditions (see the sketch after this list)
Canvas API to support complex layered editing and media transformation
Swift Package Manager for handling external libraries and dependencies
Kotlin, Swift, Objective-C as programming languages
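As a concrete illustration of the playback items above, the Kotlin snippet below shows a minimal Media3 ExoPlayer setup for an HLS story rendition, assuming the media3-exoplayer-hls module is on the classpath; the URL and listener logic are placeholders rather than the client’s code.

```kotlin
import android.content.Context
import androidx.media3.common.MediaItem
import androidx.media3.common.Player
import androidx.media3.exoplayer.ExoPlayer

// Minimal player setup for a single HLS story.
fun playStory(context: Context, onEnded: () -> Unit): ExoPlayer {
    val player = ExoPlayer.Builder(context).build()

    // Placeholder CloudFront URL for the MediaConvert-produced HLS rendition.
    val storyUrl = "https://example.cloudfront.net/stories/12345/master.m3u8"
    player.setMediaItem(MediaItem.fromUri(storyUrl))

    player.addListener(object : Player.Listener {
        override fun onPlaybackStateChanged(playbackState: Int) {
            // Advance the story marquee when the current story finishes.
            if (playbackState == Player.STATE_ENDED) onEnded()
        }
    })

    player.prepare()
    player.playWhenReady = true
    return player
}
```

A production viewer would typically also pre-buffer the next story so swiping forward never waits on a cold start.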

Impact

The Stories feature became the strategic growth engine the client was after by:
Attracting high-profile streamers by giving them a fast, flexible way to announce upcoming live sessions and engage audiences without going live
Increasing daily user engagement as viewers interact with creator stories, react in real time, and return more frequently to follow updates
Reducing operational overhead through a scalable microservices architecture and optimized video workflows that lower processing and delivery costs
Strengthening product extensibility, as the client can now introduce new reaction types, media formats, and notification categories without re-architecting the system
Establishing a foundation for future monetization, as high-engagement story formats open opportunities for premium creator tools, paid reactions, and promoted content
