Cloud Data Engineer
Lincoln, NE, US.
Urgent Hiring
C2H W2, Contract 1099, C2H 1099, Contract W2, Contract Corp-To-Corp, C2H Corp-To-Corp
Posted on: 03/30/2026
Depends on experience
Skills: Cloud Data Engineer, Databricks, ETL, Python, SSIS
Work type: Remote
Work authorization: US local resources, Work authorization required, Work authorization sponsored, Work authorization in field of study
Duration: 1 year 7 months
Veteran Service: Yes
Number of hirings: 1
Job Details
Summary:
The State of Nebraska Department of Health and Human Services (DHHS) is seeking a skilled Cloud Data Engineer to join the Data Office Team in driving the modernization of enterprise analytics. This role will focus on building scalable, high-performance data pipelines and models using modern cloud technologies such as Azure, Databricks, Snowflake, SQL, Python, Scala, and Power BI. The ideal candidate will bring a strong foundation in data engineering, a passion for solving complex data challenges, and experience working in agile environments.
Key Responsibilities:
• Design and develop data pipelines and ELT processes to integrate large, diverse datasets from multiple sources.
• Build and maintain data models and structures to support enterprise reporting and analytics.
• Collaborate with cross-functional teams to deliver BI and analytics solutions that meet business needs.
• Optimize data performance by troubleshooting and resolving issues related to large-scale data querying and transformation.
• Participate in the design and documentation of data processes, including model development, validation, and implementation.
• Contribute to a positive data safety culture by adhering to data governance and security policies.
Core Competencies:
• Data Structures & Modeling: Design and implement scalable data architectures for structured and unstructured data.
• Data Pipelines & ELT: Develop robust extraction, transformation, and loading processes using modern tools and frameworks.
• Performance Optimization: Monitor and enhance data performance during development and production.
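The ELT pattern named above (land raw data first, then transform inside the warehouse) can be sketched in a few lines. This is an illustrative example only, not part of the posting: the dataset, table names, and the use of SQLite in place of a platform like Databricks or Snowflake are all invented for demonstration.

```python
import csv
import io
import sqlite3

# Toy source extract standing in for an upstream system (invented for illustration).
RAW_CSV = """claim_id,member,amount
1,alice,100.50
2,bob,
3,alice,25.00
"""

conn = sqlite3.connect(":memory:")

# Load: land the raw rows untyped and uncleaned (the "L" before the "T" in ELT).
conn.execute("CREATE TABLE raw_claims (claim_id TEXT, member TEXT, amount TEXT)")
rows = list(csv.DictReader(io.StringIO(RAW_CSV)))
conn.executemany(
    "INSERT INTO raw_claims VALUES (:claim_id, :member, :amount)", rows
)

# Transform: clean and aggregate inside the database engine itself.
conn.execute("""
    CREATE TABLE member_totals AS
    SELECT member, SUM(CAST(amount AS REAL)) AS total
    FROM raw_claims
    WHERE amount <> ''
    GROUP BY member
""")

totals = dict(conn.execute("SELECT member, total FROM member_totals"))
print(totals)  # {'alice': 125.5} -- bob's row is dropped by the blank-amount filter
```

The design point is that the transform step runs as SQL inside the target store, so the same cleanup logic scales with the warehouse rather than with the ingestion host.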
Required Qualifications:
• Bachelor’s degree in Data Analytics, MIS, Computer Science, or a related field.
• 5+ years of experience in data engineering or data warehouse development, including dimensional modeling.
• 5+ years of experience designing and developing ETL/ELT processes using tools like SSIS, Databricks, or Python.
• Self-motivated and able to work independently in a fully remote Agile environment, taking ownership of deliverables, communicating proactively, and driving progress without close supervision.
• Comfortable on a distributed Scrum team, managing their own deliverables and staying accountable through clear, consistent communication.
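Dimensional modeling, called out in the qualifications above, organizes warehouse tables into fact and dimension tables (a star schema). The sketch below is illustrative only; the table names, columns, and use of SQLite are invented for demonstration and are not part of the role's actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Dimension table: descriptive attributes, one row per member (invented schema).
conn.execute("""
    CREATE TABLE dim_member (
        member_key  INTEGER PRIMARY KEY,
        member_name TEXT,
        region      TEXT
    )
""")

# Fact table: measurable events, linked to dimensions by surrogate key.
conn.execute("""
    CREATE TABLE fact_claim (
        claim_id   INTEGER PRIMARY KEY,
        member_key INTEGER REFERENCES dim_member(member_key),
        claim_date TEXT,
        amount     REAL
    )
""")

conn.executemany("INSERT INTO dim_member VALUES (?, ?, ?)",
                 [(1, "alice", "east"), (2, "bob", "west")])
conn.executemany("INSERT INTO fact_claim VALUES (?, ?, ?, ?)",
                 [(10, 1, "2026-01-05", 100.5),
                  (11, 1, "2026-02-01", 25.0),
                  (12, 2, "2026-02-03", 40.0)])

# A typical analytic query: roll facts up by a dimension attribute.
report = conn.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_claim f JOIN dim_member d USING (member_key)
    GROUP BY d.region ORDER BY d.region
""").fetchall()
print(report)  # [('east', 125.5), ('west', 40.0)]
```

Keeping measures in a narrow fact table and attributes in small dimensions is what lets BI tools like Power BI aggregate large datasets along any dimension with a simple join.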
Interview type: Telephonic, Virtual
Experience years required: 0



