  • Design, build, and optimize scalable and reliable data pipelines to collect, process, and store large volumes of structured and unstructured data.
  • Develop ETL/ELT workflows to ingest data from various sources (APIs, databases, flat files, etc.).
  • Design and implement efficient data models for analytics and reporting.
  • Develop and maintain data warehouses, data lakes, and data marts to support business needs.
  • Monitor and improve data processing performance, ensuring scalability and reliability.
  • Optimize SQL queries, indexing strategies, and storage techniques for faster processing.
  • Ensure data security, compliance, and privacy best practices in accordance with company policies and regulations (GDPR, HIPAA, etc.).
  • Implement role-based access controls and encryption measures.
  • Experience working with cloud platforms (AWS, Azure, GCP) and their data services.
  • Strong experience with ETL tools (Apache NiFi, Talend, Airflow, etc.).
  • Experience with containerization and orchestration (Docker, Kubernetes).
  • Familiarity with real-time data processing frameworks (e.g., Kafka, Spark Streaming, Flink).
  • Knowledge of machine learning pipelines and MLOps.
