Object Computing, Inc.
Responsibilities
• Design, build, and optimize large-scale data pipelines (Azure, AWS, GCP).
• Develop data ingestion and storage solutions.
• Implement scalable APIs and ensure system performance.
• Manage big data infrastructure and cloud deployments.
• Collaborate with developers, designers, and data scientists.
• Work in an Agile/DevOps environment.
Core Competencies
• ETL data engineering (Databricks, SQL Server, Snowflake, BigQuery, Apache Spark).
• Proficiency in Go, Java, Python, or Scala.
• Proficiency in CI/CD pipelines and Infrastructure as Code (IaC) (Terraform, CDK, Ansible).
• Hands-on experience with event-driven architectures (Kafka, Pulsar).
• Strong knowledge of data warehousing, SQL/NoSQL databases, and cloud platforms.
• Experience with distributed computing, DevOps tools, and data governance.
• Familiarity with Delta Lake, Unity Catalog, Delta Sharing, and DLT.
Requirements
• Degree in Computer Science, Data Engineering, or a related field, or equivalent experience.
• Experience with AI/ML-driven data solutions and real-time data processing.
• Expertise in building scalable APIs and integrating with modern analytics tools (Power BI, Tableau, QuickSight).
• Cloud certifications (Databricks, AWS, Azure, GCP).