
Data Engineer

Remote
  • Berlin, Berlin, Germany
  • Buenos Aires, Buenos Aires, Argentina
  • San Salvador, San Salvador, El Salvador
  • São Paulo, São Paulo, Brazil
  • Augsburg, Bayern, Germany
  • Austin, Texas, United States
  • Boston, Massachusetts, United States
Engineering

Job description

TechBiz Global is a leading recruitment and software development company. Our diverse, globally distributed team provides IT recruitment, outstaffing, outsourcing, software development, and consulting services, with a primary focus on helping our partners achieve their business goals.

Headquartered in Germany, we have successful clients all over the world, and we understand your unique needs. Our team has hands-on experience with the challenges that come with rapid growth in the IT sector. That’s why all of our offerings are built with a tech mindset.


About the role:  
The Data Engineer will collaborate closely with other Data Engineers, Software Development Engineers (SDEs), Analytics Engineers (AEs), Data Scientists (DSs), and Infrastructure Engineers (IEs) in the technology team. This role reports directly to the company's Director of Data Engineering.


Responsibilities include, but are not limited to:
• Collaborate closely with other Data Engineers, Software Development Engineers (SDEs), Analytics Engineers (AEs), Data Scientists (DSs), and Infrastructure Engineers (IEs) in the technology team.
• Build a modular, flexible data engineering infrastructure that we can improve on continuously and iteratively. Your customers will include eCommerce, marketing, supply chain, finance, business development, and legal.
• Treat our data as a strategic resource and build data products that support every aspect of our business.
• Build our data model(s) as well as our data infrastructure in collaboration with the Director of Analytics and the Head of Infrastructure Engineering.

Job requirements

Requirements

Technical Expertise

• Data Architecture and Modeling:
  ◦ Experience working with data architectures for scalable and performant systems.
  ◦ Experience with dimensional data modeling, OLAP systems, and data warehousing concepts (star/snowflake schema, etc.).
  ◦ Strong experience with GCP.
  ◦ Experience with BigQuery and Looker Studio.
  ◦ Experience with NoSQL databases (e.g., DynamoDB, Cassandra) and SQL databases (e.g., PostgreSQL, MySQL).
  ◦ Ability to design, implement, and optimize ETL pipelines for efficient data movement and transformation.
  ◦ Hands-on experience with workflow management tools such as Apache Airflow.
  ◦ Proficiency in Python and other relevant programming and scripting languages for data processing.
  ◦ Strong SQL expertise for querying large datasets.
  ◦ Knowledge of API integration for extracting data from multiple sources (internal and external systems such as Amazon's API for FBA, sales data, etc.).

• DevOps and Automation:
  ◦ Experience with CI/CD pipelines, automation scripts, and infrastructure-as-code tools (e.g., Terraform, CloudFormation).
  ◦ Ability to implement data versioning and data quality checks.

• Data Privacy:
  ◦ Strong understanding of data privacy and compliance regulations such as GDPR and CCPA.
  ◦ Ability to ensure data governance best practices, including encryption, access controls, and audit trails.

• Monitoring and Observability:
  ◦ Expertise in implementing data quality checks, monitoring data pipelines, and ensuring that systems are reliable and efficient.
  ◦ Proficiency with monitoring tools for real-time visibility of data flows.
• Experience with ecommerce/retail/supply chain data modeling is a big plus.
• Experience with the Amazon seller ecosystem is a big plus.

Communication Skills

• Ability to solve complex data challenges, propose innovative solutions for scaling and optimizing the data architecture, and proactively drive improvements.
• Ability to adapt to a fast-paced, constantly evolving scale-up environment.
• Capable of handling well-defined projects and prioritizing based on business impact.
• Proficient in English.
