
Big Data Architect

Riyadh, Riyadh · Information Technology
We are looking for a Big Data Architect who can define and own the end-to-end architecture for Big Data analytics solutions for the customer. The person will also drive our data analytics architecture design and implementation, engaging with business users, business analysts, and operations teams to understand their needs and create Big Data analytics solutions that meet them.

Primary responsibilities
  • Implementing a strategy for data architecture, ensuring all stakeholders are cognizant of the strategy and associated standards.
  • Providing thought-leadership to enforce data standards and realize the execution of the Data Architecture strategy.
  • Facilitating design reviews to govern adherence to the Data Architecture strategy and associated standards.
  • Articulating and managing architectural risks and issues and adhering to architectural governance.
  • Building strong relationships, with an ability to engage, influence, and collaborate with stakeholders across organizational boundaries to deliver high-standard outcomes.
  • Setting and achieving challenging short-, medium-, and long-term goals that exceed the standards in the field.

Requirements
  • Minimum of 8 years of experience in analytics, including at least 5 years of hands-on implementation of data platforms on Cloudera, Azure, AWS, GCP, or equivalent environments.
  • Well versed with data lakes, the Cloudera Hadoop distribution, and Cloudera ecosystem-based solutions and frameworks.
  • Prior experience as a data architect designing large, complex enterprise data environments for both structured and unstructured data.
  • Expertise in data modeling, data governance, metadata management, data security and compliance, data lifecycle management, and data archiving.
  • Excellence in building near-real-time solutions.
  • Strong understanding of Big Data analytics platforms and ETL in the context of Big Data.
  • Proficient in working with large data sets in Hadoop using HDFS, Sqoop, Hive, Spark, StreamSets, Kafka, and Apache Kudu.
Kind Regards,

Jobskey Search and Selection 

KSA Office 
Email: Consultant@jobskeysearch.com | Website: www.jobskeysearch.com

Email: Resumes@Jobskey.com | Website: www.jobskey.com