This listing has expired.
We are looking to expand our partner team of Data Engineers to help us create highly durable, highly scalable and highly loved data solutions for our partners. We believe in innovation, so you will work with cutting-edge technologies and constantly add to your skill set to produce outstanding results.
Tasks
You will be involved in everything from:
- Writing custom code for data pipelines (batch and streaming)
- Using ETL tools
- Implementing data warehouse models (SQL)
- Supporting existing data processes
- Handling DevOps tasks in the cloud
Requirements (one or more of the following)
- Programming languages: Python, Java, Scala or Go
- Cloud providers: AWS, GCP, Azure
- Data pipelining tools: Snowplow, Fivetran, dbt, Talend
- Data warehousing: Snowflake, Redshift, Greenplum, Oracle, DB2 or similar
- Relational databases: Oracle, PostgreSQL, MySQL, Microsoft SQL Server or similar
- NoSQL databases: MongoDB (Atlas), Cassandra, DynamoDB or similar
- Streaming technologies: Kinesis, Kafka, Google Pub/Sub or similar
- Containers & container orchestration: Docker & Kubernetes
- Workflow orchestration: Airflow, Cadence Workflow
- OS: Linux, macOS
- Version control: Git
- Big Data stack: Spark, EMR, Hadoop, Hive, Presto
Our offer
- Extensive training
- Unlimited paid holiday
- Diverse projects
- Cutting-edge tech
- Team events
- Company MacBook
- Diversity
- Equal employment opportunity
- Flexible hours
- Private health insurance