- The client operates an ad-exchange platform that facilitates buying and selling of media advertising inventory across multiple ad networks
- With bidding happening in real time, a large amount of distributed data processing and aggregation is required for analysis
- Maintained a datacenter with 74 physical servers
- Automated infrastructure provisioning
- Automated and maintained the big data solution stack
- Key challenges to be addressed:
  - 23–25 GB of data generated every day / 210 TB in 2 months
  - 200-microsecond latency benchmark
- Created custom Docker images containing application services by writing Dockerfiles
- Configured a Docker registry service to store the custom images
- Configured a custom bridge network in Docker to assign static IP addresses for production use; the bridge network enabled multi-host communication between Docker containers
- Containers communicated with a Kafka cluster running on different host machines
- Automated container configuration management using SaltStack modules
- Evaluated Docker orchestration using Docker Swarm and Apache Mesos
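The image build-and-publish workflow described above can be sketched roughly as follows. The service name (`bid-processor`), base image, and registry address (`registry.internal:5000`) are illustrative assumptions, not details from the project:

```shell
# Sketch of a Dockerfile for one application-service image
# (service name, base image, and paths are assumptions)
cat > Dockerfile <<'EOF'
FROM openjdk:8-jre
# Copy the packaged application service into the image
COPY target/bid-processor.jar /opt/app/bid-processor.jar
# Run the service when the container starts
CMD ["java", "-jar", "/opt/app/bid-processor.jar"]
EOF

# Run a private registry service to store custom images
# (host/port are assumptions)
docker run -d -p 5000:5000 --name registry registry:2

# Build the custom image, tag it for the private registry, and push it
# so that other hosts in the datacenter can pull it
docker build -t registry.internal:5000/bid-processor:1.0 .
docker push registry.internal:5000/bid-processor:1.0
```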

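A minimal sketch of the custom bridge setup with static container IPs, under assumed values for the subnet, network name, container name, and Kafka broker addresses. (A user-defined bridge is host-local by default; reaching Kafka brokers on other host machines, as described above, also relies on routing or published ports between hosts.)

```shell
# Create a user-defined bridge network with a fixed subnet so containers
# can be assigned predictable, static IP addresses (subnet is an assumption)
docker network create -d bridge --subnet 172.25.0.0/16 prod-net

# Start an application container with a static IP on that network;
# the Kafka bootstrap addresses point at brokers on other host machines
# (all names and addresses here are hypothetical)
docker run -d --name bid-processor \
  --network prod-net --ip 172.25.0.10 \
  -e KAFKA_BOOTSTRAP_SERVERS=kafka-host-1:9092,kafka-host-2:9092 \
  registry.internal:5000/bid-processor:1.0
```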