Analytics Engineer | Junior - Mid | Infrastructure DA team
Full Time
Hybrid
Infrastructure
Vilnius / Kaunas
The world’s most advanced VPN, and a whole lot more.
If you’re a curious problem-solver who carves their own path, join the team behind Threat Protection Pro, the NordLynx protocol, and the fastest VPN on the planet—tools that put privacy, security, and control back in people’s hands.
Your impact? Helping millions take back control of their online security, privacy, and data.
Main Responsibilities
- Acquire data from various sources (APIs, relational and non-relational databases, queues, etc.) by developing scripts, workflows, and ETL pipelines
- Support ETL processes and data transformations to address stakeholder requests
- Maintain the integrity and structure of existing data models in the data warehouse
- Use GitLab for version control, managing branches, creating merge requests, and conducting peer code reviews
- Monitor and analyze pipeline performance and data accuracy
- Participate in code reviews and write unit tests to ensure high-quality solutions
- Ensure all pipelines and processes comply with internal guidelines and requirements
- Work closely with the data analytics team to automate repetitive tasks, improve efficiency and consistency, and create ad-hoc datasets
- Discover opportunities for data acquisition, diagnostics, mapping, and correction
- Recommend and validate different ways to improve data reliability, efficiency, and quality
- Collaborate closely with cross-functional teams to ensure accurate understanding of data and business processes
Core Requirements
- Advanced proficiency in Python, experience with PySpark
- Basic knowledge of interacting with APIs (API requests)
- Familiarity with MinIO or similar environments for large-scale data workloads
- Comfortable working with GitLab (creating branches, committing code, creating merge requests, code reviews)
- Understanding of Git workflows and best practices
- Hands-on experience with unit testing and code review
- Experience with Apache Airflow (creating and maintaining data pipelines) or familiarity with other orchestration tools is a plus
- Passion for data analysis with focus on data quality
- Experience creating dashboards in tools like Looker or Grafana is a plus
- Proficiency in SQL, experience with BigQuery and dbt (data build tool) is a plus
- Knowledge of OpenSearch is a plus
- Ability to critically assess incoming requests, clarify requirements, and define a clear execution roadmap
- Excellent communication skills, problem-solving abilities, and a collaborative mindset
Tools You Will Use
- Python & PySpark
- Apache Airflow
- GitLab
- Looker & Grafana
Salary Range
- Monthly gross salary from 2600 to 4800 EUR
Perks we offer
Our global mission fuels our drive. But when you need that extra push, here’s a list of things to look forward to.
Our values
Our values are rooted in the actions of our people. They describe how we solve problems, make decisions, and, ultimately, reach our goals as a team.