wagey.gg

Data Engineer · Object Computing, Inc.

Hybrid - Asia-Pacific · 1mo ago

Tags: Remote · APAC · Cloud Computing · Data Analytics · Logistics · Data Engineer · DBA · Applied Scientist · Enterprise Architect · Go · Java · CDK · Python · Scala


Requirements

• ETL data engineering (Databricks, SQL Server, Snowflake, BigQuery, Apache Spark).
• Proficiency in Go, Java, Python, or Scala.
• Proficiency in CI/CD pipelines and Infrastructure as Code (IaC) (Terraform, CDK, Ansible).
• Hands-on experience with event-driven architectures (Kafka, Pulsar).
• Strong knowledge of data warehousing, SQL/NoSQL databases, and cloud platforms.
• Experience with distributed computing, DevOps tools, and data governance.
• Familiarity with Delta Lake, Unity Catalog, Delta Sharing, and DLT.
• Degree in Computer Science, Data Engineering, or a related field, or equivalent experience.
• Experience with AI/ML-driven data solutions and real-time data processing.
• Expertise in building scalable APIs and integrating with modern analytics tools (Power BI, Tableau, QuickSight).
• Cloud certifications (Databricks, AWS, Azure, GCP).

Responsibilities

• Design, build, and optimize large-scale data pipelines (Azure, AWS, GCP).
• Develop data ingestion and storage solutions.
• Implement scalable APIs and ensure system performance.
• Manage big data infrastructure and cloud deployments.
• Collaborate with developers, designers, and data scientists.
• Work in an Agile/DevOps environment.

Similar Jobs

Cloud Solutions Engineer · Fueled · Remote - USA · 2h ago
Tags: Remote · NA · Payments · Cloud Computing · Solutions Engineer · Cloud Engineer · Go · REST · Python · GraphQL · Node.js · AWS · Azure · GCP · JMeter · Artillery · Datadog · Terraform · Shell · CDK · Documentation

Engineering Manager - Europe · Plaid · London · 2h ago
Tags: In Office · EMEA · Staff · Banking · Fintech · Engineering Manager · Tech Lead · Team Leadership · Plaid · Data Quality · Coaching · Go · Rails

Senior Professional Services Consulting Engineer · Redis · Remote - United Kingdom · 5h ago
Tags: Remote · EMEA · Senior · Software · Public Sector · Solutions Engineer · Redis · Docker · Kubernetes · Team Management · Java · C# · Python · Go · .NET · Linux

Lead Data Engineer at LightWork AI · Unknown · London, UK · 5h ago
Tags: In Office · EMEA · Staff · Artificial Intelligence · Data Engineer · Recruiter · Python · PostgreSQL · NoSQL · Vector · Qdrant · Milvus · Governance · Mentoring · Airflow · Kafka · Jotai

Risk and Controls Assurance Analyst · Pay.UK · London · $28k – $28k/year · 5h ago
Tags: In Office · EMEA · Payments · QA Analyst · Go


© 2026 Dominic Morris. All rights reserved.