Big Data Software Outsourcing

Engineering a software platform for High Volume Transaction Processing (HVTP) requires a range of skills and tools so that every piece of the application works harmoniously and efficiently. It also means thinking quite differently. This post gives a bit of insight into the mind of a software developer working with the challenges of scale…

At Techifide, we start with the infrastructure and think about scalability first. The servers must be ready to grow if needed – possibly automatically and before reaching critical points. If you think your application is demanding today, check it again tomorrow. With social media playing such an important role in marketing nowadays, growth can far exceed what one might project, with very little warning. We need to be ready to welcome double, and then double again, the number of users you have today.
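To make "grow before reaching critical points" concrete, here is a minimal sketch of the decision an autoscaler has to make. The threshold and the numbers are hypothetical, purely for illustration – not production logic:

```javascript
// Hypothetical sketch: decide whether to add capacity *before* a hard limit.
const SCALE_OUT_THRESHOLD = 0.7; // act at 70% utilisation, not 100%

function shouldScaleOut(currentLoad, capacity) {
  // Scale when utilisation crosses the threshold, leaving headroom
  // for the sudden doubling of users described above.
  return currentLoad / capacity >= SCALE_OUT_THRESHOLD;
}

// A spike from 600 to 1,500 requests/s against a capacity of 2,000:
console.log(shouldScaleOut(600, 2000));  // false – plenty of headroom
console.log(shouldScaleOut(1500, 2000)); // true – grow before saturation
```

The point of the headroom is exactly the social-media scenario above: by the time you hit 100% it is already too late to provision.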

It’s about thinking of each piece of the software puzzle individually and giving each one equal importance. Every single part of the system must be prepared for volume: the physical machine, the database, and the code must all be equally primed for high performance.

On the code side, we prefer event-driven languages and runtimes for software development, as they’re excellent at keeping memory usage low under heavy concurrent load. Our favourite options are Node.js, Erlang, Elixir, Scala, and Ruby. If the data requires a large number of table joins, we recommend a NoSQL database. Putting a NoSQL hat on, we can start thinking about the data as part of a large net, where each record is directly connected to the records around it. If the data is indeed relational, we’d try to de-normalise the tables and create indexes where appropriate, for efficiency.
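As a small illustration of the de-normalisation trade-off, here is the same order data both ways in Node.js. The collection and field names are hypothetical:

```javascript
// Normalised: reading an order means a join-style lookup per record.
const customers = [{ id: 1, name: 'Ada' }];
const orders = [{ id: 100, customerId: 1, total: 42 }];

// De-normalised: the customer is embedded, so one read suffices.
// The trade-off: a name change must now touch every matching order.
const denormalisedOrders = [
  { id: 100, customer: { id: 1, name: 'Ada' }, total: 42 },
];

// An index (here a simple Map) turns a linear scan into an O(1) lookup –
// the same reason we add database indexes where appropriate.
const ordersById = new Map(denormalisedOrders.map(o => [o.id, o]));
console.log(ordersById.get(100).customer.name); // 'Ada'
```

Embedding costs storage and write complexity but buys exactly what high-volume reads need: one lookup instead of a join.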

Right from the beginning, the focus is on code optimisation. Principles like the Single Responsibility Principle help keep things organised, as do architectural patterns like the Model View Controller (MVC) pattern, which applies to both the front end and the back end.
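A minimal MVC sketch in Node.js (with hypothetical names) makes the single-responsibility split concrete – each piece does one job:

```javascript
// Model: owns the data and nothing else.
const model = {
  users: [],
  add(name) { this.users.push(name); },
};

// View: turns model state into output and nothing else.
const view = {
  render(users) { return `Users: ${users.join(', ')}`; },
};

// Controller: wires input to model and view, no business logic of its own.
const controller = {
  addUser(name) {
    model.add(name);
    return view.render(model.users);
  },
};

console.log(controller.addUser('Ada')); // "Users: Ada"
```

Because each responsibility lives in one place, any piece can be optimised or swapped later without disturbing the others – which matters when you are tuning for volume.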

And my, do we need to test! It’s possible to simulate concurrent API connections with a third-party application, but here at Techifide our extensive QA demands inspired us to create a dedicated application just for that. We start by opening 500K concurrent connections and ramp up to 100M. At each stage, we check CPU and RAM usage on the API’s side, spot bottlenecks, and improve the application as we go along.
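Scaled well down, the ramp-up loop looks something like this Node.js sketch. Here `callApi()` is a stand-in for a real network request, the stage sizes are illustrative, and a real run would also sample CPU and RAM on the server between stages:

```javascript
// Stand-in for a real API call: resolves after a short simulated latency.
const callApi = () => new Promise(resolve => setTimeout(resolve, 10));

// Run one stage: fire N simulated concurrent requests, time the stage.
async function rampStage(concurrency) {
  const start = Date.now();
  await Promise.all(Array.from({ length: concurrency }, callApi));
  return { concurrency, elapsedMs: Date.now() - start };
}

// Ramp through increasing stages, collecting timings per stage.
async function ramp() {
  const results = [];
  for (const n of [100, 1000, 10000]) { // scaled-down stages for the sketch
    results.push(await rampStage(n));   // inspect server metrics here
  }
  return results;
}

ramp().then(results => console.log(results));
```

The shape is the important part: hold concurrency steady, measure, find the bottleneck, fix it, then raise concurrency and repeat.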

Managing large-scale user demand is increasingly common as Software as a Service grows in popularity – something we specialise in at Techifide.