Key job responsibilities
- Work with engineering and business stakeholders to understand data requirements
- Lead the design, modelling, and implementation of structured datasets
- Evaluate and implement efficient distributed storage and query techniques
- Interact and integrate with internal and external teams and systems to extract, transform, and load data from a wide variety of sources
- Implement robust and maintainable code with clear and maintained documentation
Basic qualifications
- Degree in Computer Science, Engineering, Mathematics, or a related field
- 3+ years of industry experience
- Experience in data modelling, data warehousing, and building ETL pipelines
- Advanced SQL skills
Preferred qualifications
- Industry experience as a Data Engineer or related specialty (e.g., Software Engineer, Business Intelligence Engineer, Data Scientist) with a track record of manipulating, processing, and extracting value from large datasets
- Coding proficiency in at least one modern programming language (e.g., Python, Scala, Java)
- 2+ years of experience designing ETL/ELT workflows and proficiency in scripting (writing complex SQL with PostgreSQL, Redshift, or another relational database)
- A desire to work in a collaborative, intellectually curious environment
Benefits
☝️ A chance to work with one of the strongest Data Science teams
💻 The newest equipment, provided as needed
📌 Courses, conferences, and other professional-development activities upon request
🏄 Team-building events in Ukraine and Portugal
🌴 18 paid days off (negotiable)
👨‍💻 Fully remote or hybrid work
🌱 Equity