Data Engineer
Requirements
• Bachelor’s degree in Computer Science, Information Systems, Engineering, or equivalent practical experience
• 2–4 years of experience in data engineering, analytics engineering, or a related technical role
• Exposure to financial services or enterprise data environments is a plus
• Proficiency in Python and SQL
• Hands-on experience with modern data platforms such as Databricks, Snowflake, or similar
• Familiarity with cloud data services (Azure preferred), including data lakes, data warehouses, and orchestration tools
• Experience working with structured and semi-structured data from APIs, SaaS platforms, and databases
• Understanding of ETL / ELT concepts, data modeling, and pipeline monitoring
• Exposure to BI tools such as Power BI or Tableau is a plus
• Strong problem-solving skills and attention to detail
• Ability to learn new technologies quickly and adapt in a fast-paced environment
• Comfortable working collaboratively within cross-functional teams
• Clear written and verbal communication skills, with the ability to explain technical concepts to non-technical stakeholders

General Atlantic offers a robust reward program to all employees that supports you and your family in maintaining fulfilling, secure, and healthy lives now and into the future. It includes, but is not limited to, medical insurance, retirement savings contributions, mental and physical health resources, and an equal pay program. Additional reward programs, such as annual discretionary bonuses and long-term incentive programs, are available to eligible employees in recognition of performance and contributions to the organization’s success.
Responsibilities
Data Integration & Preparation
• Support the design, development, and maintenance of cloud-based data ingestion and integration pipelines
• Build and maintain ETL / ELT workflows using Python, SQL, Spark, and Databricks
• Assist in integrating data from heterogeneous sources, including SaaS platforms, APIs, databases, and cloud applications
• Contribute to the development of reusable data integration components and frameworks
• Monitor data pipelines, troubleshoot issues, and support production operations

Data as a Service
• Assist in developing centralized data services and APIs that enable downstream consumption by analytics, reporting, and application teams
• Support the creation and maintenance of logical data models and service-layer abstractions
• Participate in building batch and near-real-time data processing workflows
• Contribute to modernization initiatives migrating legacy data processes to cloud-native solutions

Self-Service Data Platform
• Help prepare curated datasets for use in data lakes, enterprise data hubs, and data warehouses
• Work with senior engineers to support scalable, performant data storage and compute solutions using Databricks and cloud data services
• Enable reliable data access for analytics, reporting, and data science use cases

Data Design & Development
• Develop and maintain SQL objects, data models, and transformations
• Write clean, maintainable Python code for data processing and orchestration
• Participate in code reviews, testing, and documentation to ensure data quality and reliability

While this role description is intended to be an accurate reflection of the job requirements, Actis reserves the right to modify, add, or remove duties from particular roles and assign other duties as necessary.