Job Description
What is the opportunity?
In this role as a Lead Big Data Engineer, you will provide technical leadership and management of development deliverables for the Finance Core Data Platform. The Platform, leveraging Hadoop Big-Data technologies, serves as the central repository of finance related datasets, with capabilities including the ingestion of positional/trade, sub-ledger, general-ledger trial balances, and reference data; as well as data enrichment, adjustment, reconciliation, analytics and reporting functions.
What will you do?
- Work with the Product Owner, Product Manager, and Architecture Design Authority to understand requirements and determine the best mix of development and technical solutions to meet business requirements and project objectives.
- Manage the detailed design phases of project initiatives, identifying, tracking, and resolving technical issues and ensuring solutions meet sponsor needs and project life-cycle deliverables.
- Contribute to successful project completion by identifying risks and developing / recommending mitigation strategies.
- Research product development options and provide analysis and recommendations for product direction.
- Develop detailed plans and accurate estimates for the design, build, implementation, and maintenance phases of the project.
- Ensure adequate technical/reference documentation and training is in place.
- Manage all aspects of implementation planning and coordination activities.
- Assist the application support team in troubleshooting and resolving production issues.
- Coordinate with the Quality Engineering team on all aspects of testing and verification, ensuring quality assurance testing is performed for all changes.
- Attract and retain a team of highly skilled IT professionals, including senior and junior developers. Provide leadership and direction to direct reports, creating a working environment of high expectations and commitment, with a focus on responding proactively with reliable, flexible, and innovative technology solutions.
- Spearhead and guide people through changes resulting from transformation initiatives, in alignment with RBC’s technical strategy.
- Share knowledge and experience with, and provide mentoring to, other members of the team.
What do you need to succeed?
Must-have:
- 7+ years’ experience with Apache Hadoop/Hive/Spark/Scala ETL/data-pipelines or other big-data platform technology/tools (Hortonworks, Cloudera).
- 7+ years’ experience in Core Java, and RESTful web service development.
- 7+ years’ experience with SQL; Oracle/PL-SQL or another ANSI-compliant RDBMS platform preferred.
- 3+ years’ experience on Windows/Unix/Linux OS.
- 3+ years’ experience with DevOps tools/technologies, including continuous integration/delivery tools and technologies such as Subversion/GitHub, Jenkins, Nexus, JIRA, Confluence, UrbanCode Deploy, SonarQube, and Checkmarx.
- 3+ years’ experience with containerization tools and technologies such as Docker, Kubernetes, and Red Hat OpenShift Container Platform (OCP).
- Undergraduate degree/diploma in computer science/engineering or related technology discipline.
- Solid communication and time management skills.
Nice-to-have:
- Experience with Capital Markets or other financial services middle/back-office environments.
- Experience with/exposure to cloud technology platforms such as PCF (on-prem) and AWS/Azure (off-prem).
- Experience with/exposure to Business Intelligence (BI)/Analytics/Reporting tools, such as Tableau, Datameer, Power BI, Presto, Snowflake, Apache Superset, etc.
- Experience with/exposure to test-driven development (TDD) and automated unit testing.
JOB SUMMARY
City: Bedford
Address: 120 Western Parkway, Bedford, NS
Work Hours/Week: 37.5
Work Environment: Office
Employment Type: Permanent
Career Level: Experienced Hire/Professional
Pay Type: Salary + Variable Bonus
Required Travel (%): 0
Exempt/Non-Exempt: N/A
People Manager: No
Application Deadline: 02/28/2022
Platform: Technology and Operations
Job ID: 58352