Software Developer - Data Pipelines

Thrilled to work with the LARGEST datasets of our time?

Everybody understands the importance of big data, but few developers get the chance to work with truly massive datasets. Join us if you want to tackle one of the most voluminous data sources in the world today: network-level information from telecommunication networks. You will build and improve data pipelines that analyse traffic from over half a billion subscribers.

Why join us?

To help telecommunications providers optimise and expand their networks. By providing them with critical information about the performance of thousands of applications and hundreds of millions of devices, you will help telecommunications providers make important business decisions while keeping systems like Zoom and Netflix performant and accessible to everybody.

To learn fundamental data engineering skills. You will master techniques for accelerating exploration of massive datasets, such as judiciously selecting and parallelising aggregations ahead of time, and indexing raw data to support drill-downs into trillions of records stored across tens of geographically distributed data centres.
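To give a flavour of the idea, here is a toy sketch of pre-aggregation with a raw-data index: totals are computed ahead of time so common queries never scan the raw records, while the index lets a drill-down fetch only the records behind one aggregate cell. All names and data are hypothetical, not our actual pipeline.

```python
from collections import defaultdict

# Hypothetical raw records: (hour, app, device, bytes transferred)
RAW = [
    (0, "video", "dev1", 500),
    (0, "video", "dev2", 300),
    (0, "web",   "dev1", 120),
    (1, "video", "dev3", 700),
]

def build_rollup_and_index(records):
    """Pre-aggregate byte totals per (hour, app), and record the
    offsets of the raw rows behind each cell for later drill-downs."""
    rollup = defaultdict(int)   # (hour, app) -> total bytes
    index = defaultdict(list)   # (hour, app) -> raw record offsets
    for offset, (hour, app, _dev, nbytes) in enumerate(records):
        rollup[(hour, app)] += nbytes
        index[(hour, app)].append(offset)
    return dict(rollup), dict(index)

def drill_down(records, index, hour, app):
    """Fetch only the raw records behind one aggregate cell,
    instead of scanning the whole dataset."""
    return [records[i] for i in index.get((hour, app), [])]

rollup, index = build_rollup_and_index(RAW)
print(rollup[(0, "video")])               # answered from the rollup alone
print(drill_down(RAW, index, 0, "video"))
```

At production scale the rollup and index would of course live in sharded storage across data centres rather than in-process dicts, but the shape of the trade-off is the same: a little extra work at ingest time buys interactive query latency later.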

To learn to design systems that operate close to hardware limits. Typical software developers enjoy an overabundance of resources, but you will operate at the edge of powerful hardware's capacity. Even with hundreds of cores across several CPUs, terabytes of RAM per node, and large RAID arrays, you will have to carefully budget the throughput and memory cost of each data pipeline component to deliver maximum business value without wasting resources. Your software will be deployed on customers' premises, and it will use up to an order of magnitude less hardware and electricity than the competition.
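The kind of budgeting described above often starts as a back-of-the-envelope feasibility check. The sketch below is purely illustrative (every figure and parameter name is made up), but it shows the style of reasoning: can one stage keep up with the incoming record rate, and does its buffered window fit in a node's RAM?

```python
def stage_fits(records_per_sec, bytes_per_record, cores_used,
               recs_per_core_per_sec, node_ram_bytes, window_sec):
    """Rough feasibility check for one pipeline stage: does it keep
    up in throughput, and does its buffered window fit in RAM?
    All inputs are illustrative, not real product figures."""
    cpu_capacity = cores_used * recs_per_core_per_sec
    ram_needed = records_per_sec * window_sec * bytes_per_record
    return records_per_sec <= cpu_capacity and ram_needed <= node_ram_bytes

# e.g. 5M records/s at 200 B each, buffered over a 60 s window,
# on 64 cores (100k records/s each) with 512 GiB of RAM
ok = stage_fits(5_000_000, 200, 64, 100_000, 512 * 2**30, 60)
print(ok)
```

A real deployment adds many more terms (network, disk bandwidth, replication), but forcing every component through an explicit cost model like this is what keeps the hardware footprint an order of magnitude below the norm.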

To work on a core technology. While most developers are comfortable using ready-made APIs, you will be building the APIs used by others to access valuable information that is only available through our pipelines.

To work within a highly talented team. You will join a team of experts in high-performance computing, data science, security, UI, UX, telecommunication protocols and business analytics, working together across multiple countries and enjoying many opportunities for role and geographical rotations.


Are you a good fit?

Our culture is goal-driven and quality-focused. We believe in using the right tool for the job, whether that's a classic UNIX utility or an industry-standard data processing tool. And if existing tools fail to perform, we build our own.

Skills that we value:

  • Programming
  • Good communication and analytical skills
  • Python
  • Program design and refactoring
  • Linux shell scripting
  • Familiarity with SQL

In summary, we want people who will enjoy building, maintaining, measuring, and optimising high-performance data pipelines. We are open to playing with new technologies, but we build production systems by focusing on solid principles and simplicity (simplification is often the best optimisation). As we keep growing, we depend on and share a spirit of continuous improvement. Join us if you think you can help us achieve more.


If you feel that you've got what it takes to work with us and want to join a dynamic team, apply now with your updated CV, details of your motivation, your salary expectations, and your earliest possible start date at