DataOps Engineer

2 weeks ago


Work from home, Mexico · CODIGOMX · Full-time

**DataOps Engineer (AWS) JOB-32961**

**The DataOps Engineer** will be responsible for designing, implementing, and maintaining the infrastructure and processes for managing data flow, data quality, and data access across the organization. This includes working with cross-functional teams to identify, design, and implement data solutions that meet business requirements.

**Key Responsibilities**:

- Design, implement, and maintain the data pipeline infrastructure that ingests, stores, and processes large amounts of data.
- Develop and maintain data quality checks, error handling procedures and recovery processes.
- Ensure the reliability and performance of the data pipeline infrastructure and support its operation.
- Collaborate with cross-functional teams to identify, design, and implement data solutions to meet business requirements.
- Design and implement security and data access controls to ensure the privacy and protection of sensitive data.
- Troubleshoot and resolve technical issues with the data pipeline infrastructure.
- Document the data pipeline infrastructure, processes, and procedures for reference and training purposes.
- Stay up-to-date with emerging trends and technologies in the field of data engineering and data operations.
- Utilize DevOps tools such as Terraform to automate infrastructure deployment and management.

**Requirements**:

- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data engineering, data operations, or a related field.
- Strong understanding of data pipeline design and implementation using technologies such as Apache Kafka, Apache Spark, or similar.
- Experience with data storage technologies such as Apache Cassandra, Apache HBase, or similar.
- Knowledge of data quality, error handling and recovery processes.
- Experience with cloud-based solutions such as AWS, Google Cloud, or Microsoft Azure.
- Proficiency in using DevOps tools such as Terraform for infrastructure deployment and management.
- Excellent problem-solving and analytical skills.
- Strong written and verbal communication skills.
- Ability to work in a fast-paced, dynamic environment.

**Must Have Skills**
- Kafka
- Data Modelling (Snowflake, Informatica)
- Data Warehousing
- CI/CD
- CDP
- AWS Lambda
- Streaming data
- Terraform
- Iceberg
- Kubernetes
- Alert monitoring experience with tools such as Amazon CloudWatch
- Grafana
- Experience with several monitoring tools
- Docker
- Jenkins


**Must have skills**
- Docker
- Kubernetes
- Lambda
- Kafka
- Grafana
- Data Warehousing

**Nice to have skills**
- Jenkins
- Apache Spark

Job type: Full-time

Salary: $70,000.00 - $80,000.00 per month

Schedule:

- Monday to Friday
- 8-hour shift

Benefits:

- Flexible hours
- Work from home

Work location: Remote

