We are seeking an experienced Senior Data Engineer to join our team of talented professionals. This position is a remote opportunity.
We built and operate the largest and most advanced mortgage securitization platform in the world, supporting the Uniform Mortgage-Backed Security (UMBS) of Fannie Mae and Freddie Mac.
Supporting 70% of the mortgage-backed securities in the market, we provide best-in-class single-family issuance, bond administration, disclosure, and tax services. We support a broad portfolio of products for our clients with full lifecycle management.
Our market-leading, cloud-based, end-to-end platform executes transactions on an extraordinary scale, bolstering liquidity in the secondary mortgage market, one of the largest and most important financial markets in the world. Our unique approach to securitization combines the best minds in financial services with the know-how, flexibility, and innovation of leading technologists.
- Bachelor’s degree in Computer Science or a related field
Specialized Knowledge & Skills
- Minimum of 4 years of experience building data-driven solutions.
- Applicants must be authorized to work in the US without requiring employer sponsorship now or in the future. We do not offer H-1B sponsorship for this position.
- Expertise in real-time data solutions; knowledge of stream processing, message-oriented platforms, and ETL/ELT tools is a plus.
- Strong scripting experience using Python
- Working knowledge of foundational AWS compute, storage, networking and IAM.
- AWS scripting experience with Lambda functions and knowledge of CloudFormation are nice to have.
- Hands-on experience with popular cloud-based data warehouse platforms such as Redshift and Snowflake.
- Experience with one or more data integration tools such as Attunity (Qlik), AWS Glue ETL, Talend, or Kafka.
- Strong understanding of data security – authorization, authentication, encryption, and network security.
- Experience building data pipelines, with an understanding of ingestion and transformation of structured, semi-structured, and unstructured data across cloud services.
- Demonstrated ability to be self-directed, with excellent organizational, analytical, and interpersonal skills and a record of consistently meeting or exceeding deadlines.
- Demonstrated experience in data management, with a strong understanding of process design and redesign.
- Strong communication skills to facilitate meetings and workshops: gathering data, functional, and technology requirements; documenting processes, data flows, and gap analyses; and producing supporting materials for data management and governance efforts.
- Knowledge and understanding of data standards and principles to drive best practices around data management activities and solutions.
- Strong understanding of the importance and benefits of good data quality, and the ability to champion results across functions.
- Ability to lead collaborative meetings that produce clearly documented outcomes, a concrete understanding of attendee ownership and reliability, and ongoing management and follow-up of action items.
- Acts with integrity and proactively seeks ways to ensure compliance with regulations, policies, and procedures.
Responsibilities
Job Information
The Data Engineer is responsible for solution engineering of enterprise-scale data management best practices, including patterns such as modern data integration frameworks and building scalable distributed systems using emerging cloud-based data design patterns. This role will develop data integration tasks in the data and analytics space. The position reports to the Director of the Data Management group within the Data Operations organization. This is an individual contributor role.
Key Job Functions
- Demonstrate expert ability in implementing Data Warehouse solutions using Snowflake.
- Build data integration solutions between transactional systems and the analytics platform.
- Expand data integration solutions to ingest data from internal and external sources and transform it to meet business consumption needs.
- Create security policies in Snowflake to manage fine-grained access control.
- Develop tasks for a multitude of data patterns, e.g., real-time data integration, advanced analytics, machine learning, and BI/reporting.
- Lead proof-of-concept (POC) efforts to build foundational AI/ML services for predictive analytics.
- Build data products through data enrichment and ML.
- Be a team player and share knowledge with the existing team members.