IT and R&D

Remote

Backend Python Developer - Stats Team

43 current job openings, 30+ global offices, 1,500+ people on board

We are:

RTB House is a global company that provides state-of-the-art marketing technologies for top brands and agencies worldwide. Its proprietary ad-buying engine is the first in the world to be powered entirely by Deep Learning algorithms, enabling advertisers to generate outstanding results and reach their goals at every stage of the funnel. 

As a backend developer, you will join a team developing a platform that processes huge amounts of source data into compact aggregates published through fast, modern APIs. You will have the opportunity to work with the latest technology stack in a mature project of a very large scale, where the main emphasis is on the reliability and high performance of the solutions created.
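To give a feel for the domain, here is a minimal, self-contained sketch of the core idea of the platform described above: collapsing raw source data into compact aggregates. All names (`raw_events`, `aggregate_clicks`, the campaign IDs) are hypothetical illustrations, not the team's actual code; in production this work would run at far larger scale, typically in SQL.

```python
from collections import defaultdict
from datetime import date

# Hypothetical raw events: (day, campaign_id, clicks).
raw_events = [
    (date(2024, 1, 1), "camp_a", 3),
    (date(2024, 1, 1), "camp_a", 2),
    (date(2024, 1, 1), "camp_b", 7),
    (date(2024, 1, 2), "camp_a", 4),
]

def aggregate_clicks(events):
    """Collapse raw click events into per-day, per-campaign totals."""
    totals: dict[tuple[date, str], int] = defaultdict(int)
    for day, campaign, clicks in events:
        totals[(day, campaign)] += clicks
    return dict(totals)

# Four raw rows become three compact aggregate rows.
aggregates = aggregate_clicks(raw_events)
```

The published API then serves these small aggregates instead of the raw event stream, which is what keeps the read path fast.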

 

You will:

Design and implement efficient pipelines for processing and aggregating large datasets;

Implement and maintain solutions for sharing calculated statistics with other systems and internal users;

Modernize and optimize existing systems in response to the ever-increasing scale;

Ensure the reliability and scalability of the built solutions;

Create high-quality code that addresses business needs;

Collaborate with other teams to build components of a distributed system;

Participate in building a technical culture centered on the quality of delivered solutions.

 

Desired Experience: 

Experience programming in Python using modern language features, including asyncio and typing;

Knowledge of concurrent and asynchronous programming;

Very good knowledge of SQL databases, especially PostgreSQL (BigQuery is a plus);

Ability to design and implement HTTP services (REST; GraphQL is a plus);

Knowledge of system architecture, with an emphasis on performance and scalability;

Experience working with large datasets;

A sense of responsibility for the delivered solutions;

Strong communication and teamwork skills;

C1 level in English and Polish.

 

Selected technologies used:

Python (including asyncio, FastAPI, Pydantic, typing, linters);

Google Cloud (including BigQuery, Cloud SQL for Postgres);

PostgreSQL (raw SQL, query builders);

Docker, Kubernetes, Argo CD, Argo Workflows;

GitHub;

Sentry.

 

Sample topics:

Creating a job that calculates aggregates based on raw data;

Optimizing the execution time and cost of a job that calculates aggregates;

Optimizing the resource consumption of a PostgreSQL instance by selecting configuration parameters;

Building an HTTP service that provides data to users;

Designing solutions for efficient storage and management of data in partitioned PostgreSQL tables;

Migrating BigQuery from the on-demand pricing model to capacity-based pricing;

Creating a dashboard to observe the performance and costs of maintained jobs;

Migrating tables from an on-premise database to an instance on Cloud SQL;

Creating a mechanism for tracking changes in tables with source data;

Creating an asynchronous client for BigQuery;

Migrating a service from Flask to FastAPI;

Optimizing queries to BigQuery by using partitioned tables or materialized views.
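As a flavor of one topic above, "Creating an asynchronous client for BigQuery," here is a minimal sketch of wrapping a blocking query call in an asyncio-friendly interface. `run_query_blocking` and `AsyncQueryClient` are hypothetical stand-ins, not the team's actual client or the BigQuery SDK API.

```python
import asyncio

def run_query_blocking(sql: str) -> list[dict]:
    """Stand-in for a synchronous SDK call that blocks on I/O."""
    return [{"sql": sql, "rows": 1}]

class AsyncQueryClient:
    def __init__(self, max_concurrency: int = 4):
        # A semaphore bounds how many blocking calls occupy the
        # thread pool at once.
        self._sem = asyncio.Semaphore(max_concurrency)

    async def query(self, sql: str) -> list[dict]:
        async with self._sem:
            # asyncio.to_thread keeps the event loop responsive
            # while the blocking call runs in a worker thread.
            return await asyncio.to_thread(run_query_blocking, sql)

async def main() -> list[list[dict]]:
    client = AsyncQueryClient()
    # Fan out several queries concurrently; results keep input order.
    return await asyncio.gather(*(client.query(f"SELECT {i}") for i in range(3)))

results = asyncio.run(main())
```

The same pattern (bounded concurrency over a thread-offloaded blocking call) applies to any synchronous client used from an asyncio service.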

 

We Offer: 

A friendly, collaborative atmosphere with clearly defined tasks in a well-organized team of professionals;

Exceptionally flexible terms of cooperation;

Access to the latest technologies and the opportunity to apply them in a large-scale, highly dynamic project.