Summary of this Role
We are looking for a Big Data Engineer to play a key role in building our industry-leading Customer Information Analytics Platform. Are you passionate about Big Data and highly scalable data platforms? Do you enjoy building end-to-end analytics solutions that help drive business decisions? If you have experience building and maintaining highly scalable data warehouses and data pipelines with high transaction volumes, then we need you!
The full-stack Data Engineer will design, develop, implement, test, document, and operate large-scale, high-volume, high-performance data structures for analytics and deep learning. You will implement both real-time and batch data ingestion routines using best practices in data modeling and ETL/ELT processes, leveraging AWS technologies and Big Data tools, and build a logical abstraction layer over large, multi-dimensional datasets from multiple sources. You will gather business and functional requirements and translate them into robust, scalable, operable solutions that fit well within the overall data architecture, produce comprehensive, usable dataset documentation and metadata, and provide input and recommendations on technical issues to the project manager.
Preferred Basic Qualifications
5+ years of experience with detailed knowledge of data warehouse technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools.
5+ years of work experience with very large data warehousing environments
3+ years of experience with data modeling concepts
3+ years of Python development experience
2+ years of experience in Big Data stack environments (EMR, Hadoop, Glue, Hive)
Experience in writing Spark ETL jobs
Experience using software version control and build tools (Git, Apache Subversion, Jenkins)
Demonstrated strength in architecting data warehouse solutions and integrating technical components
Strong analytical skills and excellent knowledge of SQL
Excellent communication skills, both written and verbal
Experience working with change data capture (CDC) solutions such as Oracle GoldenGate, Syncsort, and Attunity
Java development experience is preferred
Experience in gathering requirements and formulating business metrics for reporting.
Experience with Kafka, Flume, and the AWS tool stack, such as Glue ETL, Redshift, and Kinesis, is preferred.
Experience building on AWS using S3, EC2, Redshift, DynamoDB, Lambda, QuickSight, etc.
AWS certifications or other related professional technical certifications
Experience with cloud or on-premises middleware and other enterprise integration technologies
Relevant Experience or Degree in: Computer Science, Management Information Systems, Business or related field
Typically a minimum of 6+ years of relevant experience
Four-year college degree and 4 or more years of professional experience, or a high school diploma and 6 or more years of professional experience, in full life cycle design and development, including IT architecture, banking industry experience, and understanding client requirements