

AWS Data Architect

Title: AWS Data Architect
Contract Type: Contract
Location: Baltimore, Maryland
Industry:
Salary: US$100 - US$130 per hour
Start Date: ASAP
Duration: 12 Months
REF: OJAAWSDA2020_1592256596
Contact: Joe Southgate
Email: Joe.southgate@ojassociates.com
Job published: 3 months ago

Job Description

AWS Data Architect - Baltimore - 12-Month Contract - REMOTE

My client, a rapidly growing global investments business, is looking for an experienced AWS Data Architect with strong technical knowledge. This will initially be a 12-month contract and is completely remote.

Principal responsibilities:

  • Proactively own and drive collaboration with business and technology groups to understand business processes, document data flows and derive reference, transactional and analytical data entities of strategic importance to the company.
  • Design strategies and programs to collect, store, analyze and model data from internal and external sources. Maintain awareness and understanding of market data sets, with the ability to ingest and integrate them.
  • Drive the development and implementation of data design methods, data structures, and modeling standards. Implement industry standard development policies, procedures and standards.
  • Build and maintain our departmental conceptual and logical data models. Conduct data model assessments for strategic data entities and help data owners address data gaps.
  • Document architecture decisions that include business needs, point of view, rationale, pros and cons.
  • Apply your expertise to help model structured and unstructured data for applications. Own these models at a high level and consult with solutions delivery teams to help them model applications.
  • Interact with development teams to align technology priorities to provide architecture directives.
  • Participate in meetings to review project designs. This will include high-level design of the overall solution and detailed design of components as needed (operational data store, data distribution services, data warehousing, ETL/ELT, user interface, analysis/reporting, etc.).
  • Lead efforts to define/refine execution standards for all data architecture, mentor teams on standards and review all designs to ensure that project teams are meeting expectations for quality and conformity.
  • Regularly interact with leadership on work status, priority setting and resource allocations.
  • Research new tools and/or new architecture and review with project teams as applicable.
  • Work with support team to define methods for implementing solutions for database performance measurement and monitoring.

Minimum requirements:

Education and Work Experience:

  • A Bachelor's Degree in a technology area of study, preferably Computer Science, MIS or Analytics
  • 10+ years of equivalent work experience
  • 5+ years of Big Data experience on AWS
  • 5+ years of direct experience in Data Modeling (conceptual, logical and physical data models) and data solution development. Excellent knowledge of metadata management, data quality, data modeling, and related tools (Erwin, ER Studio or others) required.
  • 5+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional and NoSQL data platform technologies, and ETL/data ingestion protocols)
  • 5+ years of experience in ETL development/architecture and ETL tools
  • Experience architecting data processing applications on cloud-hosted infrastructure and cloud services (such as AWS, Azure, etc.)
  • Experience in the development and design of distributed data systems and infrastructure involving containerization, distributed parallel processing, and streaming (such as Docker, Kubernetes, Spark, Python, Java, Kafka/Kinesis)
  • Experience with building streaming solutions using Spark, Apache Kafka or Kinesis
  • Experience migrating on-prem databases to AWS PostgreSQL, Aurora, Snowflake, etc.
  • Experience in consuming and building REST APIs
  • Experience with database performance tuning, memory optimization, partitioning, etc.
  • Deep experience in logical, physical and semantic data modeling of structured and unstructured data
  • SQL Query development skills for analyzing and profiling data
  • Experience with multiple SDLC methodologies (Waterfall and Agile)
  • Ability to clearly communicate complex technical ideas, regardless of the technical capacity of the audience
  • Strong interpersonal skills and the ability to work as part of a team
  • Ability to quickly learn and adapt modeling methods from case studies or other proven approaches

If interested, please respond with an updated resume.