SOFT's client, located in multiple locations (hybrid), is looking for a Data Engineer for a long-term contract assignment.
Job Profile Summary
The Data Engineer designs and develops data pipelines (data ingestion and extract, transform, load processes) that are scalable, repeatable, and secure for stakeholder needs.
This position is considered intermediate level and performs work of moderate complexity. The incumbent works under general supervision and may take direction from a more senior team member. This job does not have any direct reports.
PRINCIPAL DUTIES AND RESPONSIBILITIES
• Creates data management strategies to support business intelligence efforts for various stakeholders.
• Builds out data architecture to support data strategy.
• Designs the logical data model and implements the physical database structure.
• Ensures the quality, security, and reliability of data and analytics systems and software across the organization, and troubleshoots any issues that arise.
• Constructs and implements operational data stores and data marts.
• Supports deployed data applications and analytical models, serving as a trusted advisor to Data Scientists and other data consumers by identifying data problems and guiding issue resolution.
• Creates real-time and batch extract, transform, load (ETL) data processes aligned with business needs.
• Supports optimization of data pipelines by reducing complexity, improving readability, and increasing performance.
• Develops technical requirements for their respective domain and the applicable applications, solutions, and/or systems.
• Augments data pipelines from raw Online Transaction Processing (OLTP) databases to data solution structures.
• Performs design reviews of code, tables, and databases with respect to the overall architecture of the data warehouse.
• Documents data flow diagrams, security access, data quality, and data availability across all business systems.
• Develops and maintains a query library to support recurring data requests.
EDUCATION AND EXPERIENCE
• Bachelor's degree in a related field, or commensurate specialized training, certification, or work experience
• Minimum one year of work experience
KNOWLEDGE AND SKILLS
• Intermediate understanding of data warehouse technical architectures and extract, transform, load (ETL) or extract, load, transform (ELT) processes
• Intermediate knowledge and experience with data warehouse design, data modeling, and data processing pipelines
• Knowledge of scripting languages
• Intermediate knowledge and understanding of big data tools and software
• Familiar with Continuous Integration (CI) and Continuous Delivery (CD) tools and pipelines
• Intermediate knowledge of cloud-based data platforms
• Intermediate understanding of data security and privacy regulations
• Intermediate understanding of the software development lifecycle (SDLC) and Agile methodologies
• Intermediate critical thinking and decision-making abilities
• Strong written and verbal communication skills
• Excellent attention to detail