Overview
Remote
USD 150,000.00 - 190,000.00 per year
Full Time
Skills
Team Building
Financial Services
Innovation
Finance
Extraction
Transact-SQL
Database
Analytics
Workflow
Business Intelligence
Collaboration
Usability
Microsoft SSIS
Continuous Improvement
Data Extraction
Extract
Transform
Load
SQL
PostgreSQL
Microsoft SQL Server
Data Warehouse
Snowflake Schema
Python
Data Manipulation
Pandas
NumPy
PySpark
Microsoft Azure
Amazon Web Services
Big Data
Apache Hadoop
Apache Spark
Docker
Kubernetes
Database Design
Data Architecture
Management
API
Orchestration
Workflow Management
Testing
Relational Databases
Dimensional Modeling
Cloud Computing
Data Engineering
Apache Airflow
Cloud Storage
Conflict Resolution
Problem Solving
Job Details
Remote - Lead Data Engineer - up to $190K base - join a team building systems to make data-driven business decisions
This Jobot Job is hosted by: Chuck Wirtz
Are you a fit? Easy Apply now by clicking the "Apply Now" button and sending us your resume.
Salary: $150,000 - $190,000 per year
A bit about us:
Our client in the financial services industry is seeking a Lead Data Engineer to join their team. This is a full-time, direct-hire, remote role paying a base salary of $150,000-$190,000 plus benefits, depending on experience.
Why join us?
This role is ideal for someone who thrives in a dynamic, fast-paced environment, enjoys solving complex data problems, and is passionate about driving innovation in data engineering. If you're looking to make an impact on the financial landscape with cutting-edge data solutions, this could be for you!
Job Details
Core Responsibilities:
o Lead the design and implementation of end-to-end data pipelines, from extraction (API, scraping, pyodbc) to cleansing/transformation (Python, T-SQL) and loading into SQL databases or data lakes (a minimal sketch of this flow follows this list).
o Oversee the development of robust data architectures that support efficient querying and analytics, ensuring high-performance and scalable data workflows.
o Collaborate with data scientists, software developers, business intelligence teams, and stakeholders to develop and deploy data solutions that meet business needs.
o Ensure smooth coordination between engineering and other teams to translate business requirements into technical solutions.
o Guide the development of data models and business schemas, ensuring that they are optimized for both relational (3NF) and dimensional (Kimball) architectures.
o Lead the creation of scalable, reliable data models and optimize them for performance and usability.
o Develop and maintain the infrastructure for large-scale data solutions, leveraging cloud platforms (e.g., Azure) and containerization technologies (e.g., Docker).
o Lead the use of modern data platforms such as Snowflake and Fabric, ensuring their effective use in large-scale data solutions.
o Manage and optimize data pipelines using tools such as Apache Airflow, Prefect, dbt, and SSIS, ensuring that all stages of the pipeline (ETL) are efficient, scalable, and reliable.
o Ensure robust testing, monitoring, and validation of all data systems and pipelines.
o Drive continuous improvement in data engineering processes and practices, ensuring they remain cutting-edge, efficient, and aligned with industry best practices.
o Foster a culture of clean code, best practices, and rigorous testing across the team.
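The pipeline responsibility above, in miniature: a minimal sketch of the extract-transform-load flow using pyodbc, pandas, and SQLAlchemy. All server, database, and table names here are placeholders for illustration, not details from the client's environment.

    import pandas as pd
    import pyodbc
    from sqlalchemy import create_engine

    # Extract: pull raw rows from a source SQL Server database via pyodbc.
    src = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=source-host;DATABASE=raw_db;Trusted_Connection=yes;"
    )
    raw = pd.read_sql("SELECT account_id, txn_date, amount FROM dbo.transactions", src)

    # Transform: cleanse with pandas -- dedupe, coerce types, drop bad rows.
    clean = (
        raw.drop_duplicates()
           .assign(txn_date=lambda d: pd.to_datetime(d["txn_date"], errors="coerce"))
           .dropna(subset=["txn_date", "amount"])
    )
    assert clean["amount"].notna().all()  # basic validation before load

    # Load: append the cleansed frame to a staging table in the warehouse.
    engine = create_engine(
        "mssql+pyodbc://warehouse-host/analytics_db"
        "?driver=ODBC+Driver+18+for+SQL+Server&trusted_connection=yes"
    )
    clean.to_sql("transactions_clean", engine, schema="stg",
                 if_exists="append", index=False)

In practice each stage would be factored into its own tested function and run under an orchestrator, as the requirements below describe.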
Required Skills:
o Strong experience with data pipeline design and implementation, including data extraction, transformation, and loading (ETL) processes.
o Proficiency in SQL (Postgres, SQL Server) and experience with modern data warehouse solutions (e.g., Snowflake, Fabric).
o Expertise in Python for data engineering tasks, including data manipulation (Pandas, NumPy), distributed processing (Dask, PySpark), and API development (FastAPI).
o Solid knowledge of cloud platforms (Azure, AWS) and big data technologies (Hadoop, Spark).
o Hands-on experience with Docker, Kubernetes, and containerized environments.
o Strong understanding of dimensional modeling (Kimball), relational database design (3NF), and best practices in data architecture.
o Experience with API development, including building and managing API integrations.
o Proficiency with orchestration tools like Prefect or Airflow for workflow management (see the DAG sketch after this list).
o Strong focus on testing and validation, ensuring that all data systems meet reliability and performance standards.
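To make the orchestration requirement concrete, here is a toy Airflow DAG wiring the same three stages together. The DAG id, schedule, and task bodies are hypothetical, and the schedule argument assumes Airflow 2.4+ (older versions use schedule_interval).

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():      # pull from source APIs and databases
        ...

    def transform():    # cleanse and reshape with pandas / T-SQL
        ...

    def load():         # persist to the warehouse (e.g., Snowflake)
        ...

    with DAG(
        dag_id="daily_finance_etl",   # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load  # enforce ETL ordering

Prefect expresses the same dependency graph with flows and tasks; either tool covers this requirement.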
Experience & Qualifications:
- 5+ years of experience in data engineering roles, with a proven track record of developing and maintaining data pipelines and architectures.
- Experience working with large-scale data platforms and cloud environments.
- Strong background in relational databases, dimensional data modeling, and cloud-native solutions (see the star-schema sketch after this list).
- Familiarity with data engineering tools such as Apache Airflow, Prefect, and cloud storage platforms.
- Excellent problem-solving skills, with the ability to navigate complex technical challenges.
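For the dimensional-modeling background mentioned above, the Kimball idea in its smallest form: split a flat extract into a dimension with a surrogate key and a fact table that references it. Table and column names are illustrative only.

    import pandas as pd

    # A denormalized extract, as it might arrive from a source system.
    flat = pd.DataFrame({
        "customer": ["acme", "acme", "globex"],
        "region":   ["east", "east", "west"],
        "txn_date": ["2024-01-02", "2024-01-03", "2024-01-02"],
        "amount":   [120.0, 75.5, 300.0],
    })

    # Dimension: one row per customer, keyed by a generated surrogate key.
    dim_customer = (
        flat[["customer", "region"]]
        .drop_duplicates()
        .reset_index(drop=True)
        .rename_axis("customer_key")
        .reset_index()
    )

    # Fact: measures plus a foreign key into the dimension (star schema).
    fact_sales = flat.merge(dim_customer, on=["customer", "region"])[
        ["customer_key", "txn_date", "amount"]
    ]

At warehouse scale the same shapes become tables in Snowflake or SQL Server, with 3NF source models feeding the dimensional layer.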
Interested in hearing more? Easy Apply now by clicking the "Apply Now" button.
Jobot is an Equal Opportunity Employer. We provide an inclusive work environment that celebrates diversity, and all qualified candidates receive consideration for employment without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.
Sometimes Jobot is required to perform background checks with your authorization. Jobot will consider qualified candidates with criminal histories in a manner consistent with any applicable federal, state, or local law regarding criminal backgrounds, including but not limited to the Los Angeles Fair Chance Initiative for Hiring and the San Francisco Fair Chance Ordinance.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.