Solution Architect | Data & Security Startup $40m Raised | NY or MA

Job Title: Solution Architect | Data & Security Startup $40m Raised | NY or MA
Contract Type: Permanent
Location: New York
Salary: $200,000
Start Date: ASAP
Contact Name: Jack Wooster
Job Published: 5 days ago

Job Description

Our client is building a brand-new consulting team in the USA and is looking for their first Solution Architect on the ground in America. You will lead the design of customer architectures for the installation, integration and orchestration of their products and enterprise customer data.

Our client is a rapidly growing VC-backed company headquartered in London, building software to enable the safe and ethical use of valuable data for analytics and machine learning. They work with large organisations worldwide in financial services, telecommunications, pharma and government, enabling them to get the most out of data without compromising on privacy and security.

They are pioneering the new enterprise software category of Privacy Engineering to serve this emerging business need and address a social issue of growing importance. Their technology enables organisations to safely analyse and mine sensitive datasets while protecting an individual’s privacy.
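
As an illustration only (this is a generic sketch, not the client's product, and all field names and values are hypothetical), privacy engineering of this kind typically combines pseudonymisation of direct identifiers with generalisation of quasi-identifiers before data reaches analysts. A minimal Python example:

  import hashlib

  def pseudonymise(value, salt):
      # Replace a direct identifier with a salted hash so records can still be
      # joined and counted without exposing the raw value.
      return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()[:16]

  SALT = "rotate-per-dataset"  # hypothetical; in practice managed as a secret

  records = [
      {"customer_id": "C-1001", "postcode": "SW1A 1AA", "spend": 420.50},
      {"customer_id": "C-1002", "postcode": "EC2N 2DB", "spend": 99.99},
  ]

  masked = [
      {
          "customer_id": pseudonymise(r["customer_id"], SALT),  # direct identifier masked
          "postcode": r["postcode"].split()[0],                 # quasi-identifier generalised to outward code
          "spend": r["spend"],                                  # analytic value kept as-is
      }
      for r in records
  ]

  print(masked)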

Responsibilities

  • Define the delivery architecture covering products, integration with customer components, operational data flow and orchestration.
  • Develop a detailed understanding of the operation and usage of the products, and become familiar with their configuration and maintenance
  • Understand the data privacy issues and requirements (size, scope, risk issues, exposure impact), data volumes, anticipated masking complexity, performance goals and the existing operational data infrastructure (e.g. Hadoop)
  • Understand requirements for data separation, authorisation, information security controls and regulatory issues
  • Contribute to product, privacy and technical discussions during pre-sales customer engagement
  • Define the test strategy and acceptance criteria along with approaches for resilience, business continuity and backup/recovery where appropriate
  • Obtain formal agreement and sign-off for solution architecture
  • Travel to customer sites within North America, and possibly further afield, as required
  • Establish strong working relationships with teams in our London headquarters

Required

  • Bachelor’s degree in Computer Science or a Science or Engineering discipline
  • Proven track record of architecting sophisticated product integrations in customer enterprise environments, leveraging components such as Hadoop, Data Flow, enterprise security modules, RDBMS, workflow automation, ETL tools and other related technologies
  • Knowledge of Linux
  • Experience with scripting languages (e.g. shell, python, perl)
  • Experience with database schemas and SQL
  • Proven ability to deliver results under pressure with rapidly evolving propositions, client demands and business needs
  • You care deeply about customer success
  • You enjoy the variety and fast pace of a dynamic start-up; you’re flexible in your approach and comfortable with ambiguity
  • You have a good sense of humour and think work should be fun as well as intellectually satisfying

Desirable

  • Detailed knowledge of Hadoop architecture including primary operational components (HDFS, YARN, Kerberos, Ambari/Hue, Hive/Impala)
  • Detailed knowledge of common Data Flow / Streaming environment and technology (e.g. Apache NiFi, Kafka, Confluent, StreamSets)
  • Experience of workflow automation and Hadoop orchestration tools (Azkaban, Control-M, Oozie)
  • Experience working with typical Hadoop file formats (csv, Avro, Parquet, ORC) and compression techniques (Gzip, Bzip2, LZO, Snappy, Deflate, Sequence)
  • Knowledge of emerging big data security and auditing tools (Sentry, Ranger, Knox)
  • Operational experience of customer Hadoop deployments (Hortonworks, Cloudera)
  • Programming experience in Java, Python or similar
  • Experience of integrating with enterprise RDBMS infrastructure
  • Experience with Amazon AWS and other cloud platforms
  • Broad knowledge of Hadoop and Linux security infrastructure
  • Experience of integrating to LDAP-based directory services for authentication and authorisation
  • Experience of implementing solutions for resilience, business continuity and backup/recovery
  • Exposure to data warehouses and operational data stores
  • Gathering, reviewing and validating business and data requirements