Role: Data Architect
Location: Jersey City / Boston (hybrid, onsite)
Employment Type: Full-time / Contract
Job Description: We are seeking a Data Architect to help drive our data strategy for the Private Banking and Investment Management business lines. This individual must have experience working on modern data platforms capable of supporting big data, relational and non-relational databases, data warehousing, analytics, machine learning, and data lakes. Key responsibilities will include migrating our legacy Oracle data warehouses to a new data platform that will serve as the foundation for a key set of offerings running on Oracle Exadata and Cloudera's distribution technology. The Data Architect will continue to support, develop, and drive the data roadmap for our systems and business lines.
Primary Skills: Snowflake, Spark, Kafka, and SQL
Key Responsibilities Include:
• Participate in strategic planning and contribute to the organization's data strategy and roadmap.
• Develop a thorough understanding of the current data warehouse systems and the user communities' data needs and requirements.
• Define the legacy data warehouse migration strategy; understand the existing target platform and data management environment.
• Facilitate the establishment of a secure data platform on on-premises Cloudera infrastructure.
• Document and develop ETL logic and data flows, both batch and real-time streaming, to make data assets easy to use.
• Migrate to, operationalize, and support the platform.
• Manage and provide technical guidance and support to the development team, ensuring best practices and standards are followed.
Qualifications for this role include:
- Bachelor's degree in computer science or a related technical field, or equivalent experience
- 10+ years of experience in IT, primarily in hands-on development
- Strong knowledge of architectural principles, frameworks, design patterns and industry best practices for design and development.
- 6+ years of hands-on data warehouse project experience
- Strong hands-on experience with Snowflake
- Strong hands-on experience with Spark
- Strong hands-on experience with Kafka
- Experience with performance tuning of SQL queries and Spark jobs
- Experience designing efficient and robust ETL/ELT workflows and schedulers
- Experience working with Git, Jira, and Agile methodologies.
- Experience supporting the end-to-end development life cycle and SDLC processes
- Strong written and verbal communication skills
- Strong analytical and problem-solving skills
- Self-driven; able to work in teams and independently as required
Nice To Have:
• Working experience with Snowflake, AWS/Azure/GCP
• Working experience in the financial industry is a plus
Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence and their proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.