Hands-on computer scientist with Python expertise and experience using Python in high-performance, high-data-volume scenarios. This role will focus on implementing various features as part of the development of new models.
The Global Risk Analytics (GRA) technology group within Bank of America is looking for a talented Software Engineer to join our growing team. We are seeking a self-starting individual who is motivated and eager to learn and grow as part of a team responsible for developing a model execution and data management framework using Spark and Python. The candidate's responsibilities will include Python coding, database data structure development, data analysis using SQL (as needed), Unix shell scripting, and documentation.
Candidates should have a sense of urgency when handling issues impacting the business, strong problem-solving skills with a structured and repeatable approach, and the ability to work with limited supervision. Good communication skills and experience with an Agile methodology are expected, along with a desire to take ownership of projects or components as needed and the ability to quickly grasp business needs and adapt to change.
- Seeking individual with 7+ years overall experience, including strong programming experience and practical knowledge of object-oriented software engineering
- 5+ years of solid Python programming experience, preferably with Apache Spark or distributed computing experience
- Experience in developing data processing tasks using Python / PySpark, such as reading data from external sources, merging data, performing data enrichment, and loading into target data destinations
- Relational database / SQL experience with Oracle, MS SQL Server, Hive / Impala, etc.
- Technical / Feature Lead experience
- Solid database development skills and familiarity with ETL concepts / design
- Experience with Agile Development, SCRUM, or Extreme Programming methodologies
- Knowledge of the Banking and Finance domain. Strong problem-solving, analytical, and interpersonal skills. Experience working with model developers or in machine learning
- Good understanding of CI/CD tools such as Jenkins, SonarQube, Artifactory, and Ansible. CI/CD implementation and deployment to non-production environments
- Experience in developing solutions using Hadoop technologies (Spark, MapReduce, Hive / Impala, Sqoop, Oozie, etc.) along with data integration / data security on Hadoop ecosystem
- Unix shell scripting capabilities and exposure to a job scheduler such as AutoSys
- Strong problem-solving and troubleshooting skills
- Experience with Test-Driven Development (TDD)
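To illustrate the kind of data processing task described above (reading from external sources, merging, enriching, and loading into a target), here is a minimal pure-Python sketch. The source data, column names, and enrichment rule are hypothetical; in a real PySpark implementation, `spark.read`, `DataFrame.join`, `withColumn`, and `DataFrame.write` would replace these in-memory stand-ins.

```python
import csv
import io

# Hypothetical inline CSV data standing in for external sources (files, tables).
CUSTOMERS_CSV = "id,name\n1,Alice\n2,Bob\n"
BALANCES_CSV = "id,balance\n1,2500\n2,900\n"

def read_source(text):
    """Read a CSV 'source' into a list of row dicts (stand-in for spark.read.csv)."""
    return list(csv.DictReader(io.StringIO(text)))

def merge(left, right, key):
    """Inner-join two row lists on a key column (stand-in for DataFrame.join)."""
    index = {row[key]: row for row in right}
    return [{**l, **index[l[key]]} for l in left if l[key] in index]

def enrich(rows):
    """Add a derived column -- a hypothetical risk tier based on balance."""
    for row in rows:
        row["tier"] = "high" if int(row["balance"]) >= 1000 else "low"
    return rows

def load(rows, target):
    """Append rows to a target collection (stand-in for DataFrame.write)."""
    target.extend(rows)
    return target

# Read -> merge -> enrich -> load, end to end.
target = load(enrich(merge(read_source(CUSTOMERS_CSV),
                           read_source(BALANCES_CSV), "id")), [])
```

The pipeline shape (read, join on a key, derive columns, write out) is the same one PySpark expresses over distributed DataFrames; the point here is only the pattern, not the engine.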