Data Engineer
Job Description
The primary responsibility of the Senior Data Management Engineer is to build data pipelines, model and prepare data, perform complex data analysis to answer business questions, build and automate a data pipeline and quality framework that enables and promotes self-service data pipelines, and assist in operationalizing AI / ML engineering solutions. This role is expected to lead and guide other team members and evangelize design patterns and coding standards.
This role plays an active part in our Data Modernization project to migrate from on-prem platforms such as IBM Netezza to the cloud.
Responsibilities:
Team up with the engineering teams and enterprise architecture (EA) to define standards, design patterns, accelerators, development practices, DevOps and CI/CD automation
Create and maintain the data ingestion, quality testing and audit framework
Conduct complex data analysis to answer queries from Business Users or Technology team partners, whether raised directly by Analysts or stemming from one of the Reporting tools such as PowerBI, Tableau, or OBIEE.
Build and automate the data ingestion, transformation and aggregation pipelines using Azure Data Factory, Databricks / Spark, Snowflake, Kafka as well as Enterprise Scheduler tools such as CA Workload automation or Control M
Set up and evangelize a metadata-driven approach to data pipelines to promote self service (see the sketch after this list)
Set up and continuously improve data quality and audit monitoring as well as alerting
Constantly evaluate the process automation options and collaborate with engineering as well as architecture to review the proposed design.
Demonstrate mastery of build and release engineering principles and methodologies including source control, branch management, build and smoke testing, archiving and retention practices
Adhere to, enhance, and document design principles and best practices by collaborating with Solution Architects and, in some cases, Enterprise Architects
Participate in and support the Data Academy and Data Literacy program to train the Business Users and Technology teams on Data
Respond to SLA-driven production data quality or pipeline issues
Work in a fast-paced Agile/Scrum environment
Identify and assist with implementation of DevOps practices in support of fully automated deployments
Document the Data Flow Diagrams, Data Models, Technical Data Mapping and Production Support Information for Data Pipelines
Follow industry-standard data security practices and evangelize them across the team.
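For illustration only, here is a minimal sketch of the metadata-driven ingestion pattern referenced above, written in PySpark for Databricks. The metadata table, its columns, and all paths are hypothetical examples invented for this sketch, not part of the posting.

```python
# Illustrative metadata-driven ingestion sketch (PySpark on Databricks).
# The metadata table, its columns, and all paths are hypothetical examples.
import json

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("metadata_driven_ingest").getOrCreate()

# Behaviour is driven by rows in a metadata table instead of per-source jobs,
# which is what makes onboarding a new feed a self-service exercise.
metadata = spark.table("ops.pipeline_metadata").where(F.col("enabled"))

for row in metadata.collect():
    df = (
        spark.read.format(row["source_format"])       # e.g. "parquet", "csv"
        .options(**json.loads(row["read_options"]))   # reader options stored as JSON
        .load(row["source_path"])
    )

    # Simple quality gate: reject the load if required columns contain nulls.
    required = row["required_columns"].split(",")
    bad_rows = df.where(" OR ".join(f"{c} IS NULL" for c in required)).count()
    if bad_rows:
        raise ValueError(f"{row['target_table']}: {bad_rows} rows fail null checks")

    (
        df.withColumn("_ingested_at", F.current_timestamp())
        .write.mode(row["write_mode"])                # "append" or "overwrite"
        .saveAsTable(row["target_table"])
    )
```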
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to [email protected]. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy:
Skills and Requirements
5+ years of experience in an Enterprise Data Management or Data Engineering role
3+ years of hands-on experience building metadata-driven data pipelines using Azure Data Factory and Databricks / Spark for a cloud data lake
5+ years of hands-on experience using one or more of the following for data analysis and wrangling: Databricks, Python / PySpark, Jupyter Notebooks
Expert-level SQL knowledge on databases such as, but not limited to, Snowflake, Netezza, Oracle, SQL Server, MySQL, Teradata
3+ years of hands-on experience with one or more big data technologies such as Cloudera Hadoop, Pivotal, Vertica, or MapR is a plus
Experience working in a multi-developer environment and hands-on experience using either Azure DevOps or GitLab
Experience supporting SLA-driven production data pipelines or data quality is preferred
Experience with, or a strong understanding of, traditional enterprise ETL platforms such as IBM DataStage, Informatica, Pentaho, Ab Initio, etc.
Functional knowledge of some of the following technologies - Terraform, Azure CLI, PowerShell, Containerization (Kubernetes, Docker)
Functional knowledge of one or more Reporting tools such as PowerBI, Tableau, OBIEE
Team player with excellent communication skills, able to communicate with the customer directly and explain the status of deliverables in scrum calls
Ability to implement Agile methodologies and work in an Agile DevOps environment
Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field and 5+ years of experience with various cloud technologies within a large-scale organization
Personal Attributes: Self-starter, Collaborative, Curious, Strong work ethic, highly motivated, Team oriented
Experience designing and building complex data pipelines in an agile environment
Expertise in data analysis and wrangling using SQL, Python, and Databricks (see the sketch after these requirements)
Experience with modern cloud development and design concepts; the software development lifecycle; multi-developer code versioning and conflict resolution; and planning, design, and problem resolution for enterprise data applications / solutions
Demonstrated ability in developing a culture that embraces innovation, and challenges existing paradigms
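As a hypothetical illustration of the SQL / Python / Databricks wrangling expertise listed above, the sketch below answers a typical business question once with the PySpark DataFrame API and once with Spark SQL. The table and column names (sales.orders, amount, region, and so on) are invented for this example.

```python
# Hypothetical data-analysis / wrangling example in PySpark and Spark SQL.
# The table and column names below are invented for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("order_analysis").getOrCreate()

orders = spark.table("sales.orders")

# Business question: monthly revenue by region for completed orders,
# expressed with the DataFrame API.
monthly_revenue = (
    orders.where(F.col("status") == "COMPLETED")
    .groupBy(F.date_trunc("month", "order_date").alias("month"), "region")
    .agg(F.sum("amount").alias("revenue"))
    .orderBy("month", "region")
)

# The same question expressed in Spark SQL, e.g. to hand off to an analyst.
orders.createOrReplaceTempView("orders")
monthly_revenue_sql = spark.sql("""
    SELECT date_trunc('month', order_date) AS month,
           region,
           SUM(amount)                     AS revenue
    FROM   orders
    WHERE  status = 'COMPLETED'
    GROUP  BY 1, 2
    ORDER  BY 1, 2
""")
```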