Job Description:
Cloudera is seeking an experienced Solutions Architect to join our team. This key role has two major responsibilities: first, to work directly with our customers and partners to optimize their plans and objectives for architecting, designing, and deploying Apache Hadoop environments; second, to help build and design reference configurations that enable our customers and influence our product.
The Solutions Architect facilitates the flow of communication between Cloudera teams and the customer. For this strategically important role, we are seeking outstanding talent to join our team.
Responsibilities:
Work directly with the customer’s technical resources to devise and recommend solutions based on the understood requirements
Analyze complex distributed production deployments, and make recommendations to optimize performance
Document and present complex architectures for the customer’s technical teams
Work closely with Cloudera’s teams at all levels to help ensure the success of consulting engagements with customers
Help design and implement Hadoop architectures and configurations for customers
Drive projects with customers to successful completion
Write and produce technical documentation and knowledge base articles
Participate in the pre- and post-sales process, helping both the sales and product teams interpret customers’ requirements.
Keep current with technologies in the Hadoop and big data ecosystem.
Attend speaking engagements when needed.
Requirements:
4+ years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and / or globally distributed solutions
Experience designing and deploying large-scale production Hadoop solutions
Ability to understand and translate customer requirements into technical requirements
Experience designing queries against data in a Hadoop environment using tools such as Apache Hive, Apache Druid, or Apache Phoenix
Experience installing and administering multi-node Hadoop clusters
Strong experience implementing software and / or solutions in enterprise Linux or Unix environments
Strong understanding of various enterprise security solutions such as LDAP and / or Kerberos
Strong understanding of network configuration, devices, protocols, speeds and optimizations
Strong understanding of Java development, debugging, and profiling
Significant prior experience writing to network-based APIs, preferably REST / JSON or XML / SOAP
Solid background in database administration and design, along with data modeling using star schemas, slowly changing dimensions, and / or data capture
Experience architecting data center solutions, properly selecting server and storage hardware based on performance, availability, and ROI requirements
Demonstrated experience implementing big data use cases and an understanding of standard design patterns commonly used in Hadoop-based deployments
Excellent verbal and written communications
Nice to have, but not required:
Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.
Hortonworks / Cloudera certification (Admin and / or Developer) or data science experience
Familiarity with data science notebooks such as Apache Zeppelin, Jupyter, or IBM DSX
Demonstrable experience implementing machine learning algorithms using R or TensorFlow
Automation experience with Chef, Puppet, Jenkins, or Ansible
Familiarity with scripting languages such as Bash, Python, and / or Perl
Experience with cloud platforms and deployment automation