HR Opportunities

Big Data Architect

Plymouth, MN
$140,000 - $160,000
Job Type: Direct Hire
Jul 23, 2018
Job ID
Title:  Big Data Architect
Reports to:  CEO

Our Plymouth, MN-based company has been growing rapidly since 2005. What began as a data consulting firm is now a leading-edge enterprise software development company expanding onto the national stage, giving organizations the means to transform their big data into actionable insights.
This position is responsible for designing, architecting, and implementing Big Data processing systems capable of processing, storing, and distributing data in robust Data & Analytics solutions. Working with multi-technology, cross-functional teams and clients, this role will lead and manage the entire life cycle of our Big Data solution.
Reporting to the Chief Executive Officer, the Big Data Architect will lead architectural decisions and provide technology leadership and direction to the organization, end-users, and system partners in early adopter production environments.

Position Activities and Tasks 
  • Architecting and evolving the Enterprise Big Data Platform to support data management, operational, reporting, and analytical systems and applications
  • Designing, installing, configuring and administering Enterprise Big Data Hadoop platform: Dev, QA, Production clusters, applications and services in both physical and virtualized environments
  • Implementing best practices to design, install and administer services to secure Big Data environments, applications and users including Kerberos, Knox and Ranger 
  • Implementing best practices to configure and tune Big Data environments, application and services, including capacity scheduling
  • Installing and configuring high performance distributed analytical applications using Enterprise Big Data platform, including commercial and open source Machine Learning frameworks
  • Managing capacity use to ensure high availability and multi-tenancy of Big Data systems
  • Performing capacity planning based on Enterprise project pipeline and Enterprise Big Data roadmaps
  • Providing technical inputs during project solution design, development, deployment, and maintenance phases
  • Advising on purchase decisions based on business requirements
  • Assisting with preparation and review of vendor SOWs 
  • Assisting and advising the network architecture and data center teams during hardware installation, configuration, and troubleshooting
  • Leading the architecture and design of next generation Big Data platforms 
Minimum Requirements:
  • Bachelor’s degree in Computer Science or related field
  • 8+ years of expertise in data architecture and modeling, relational database technology, transaction processing, and data warehouse design
  • 5+ years of experience across relevant Big Data and Analytics areas of expertise, including:
      • Enterprise Business Intelligence and Analytics (e.g., SAP, Oracle, Teradata)
      • Hadoop and other industry Big Data frameworks
      • Underlying infrastructure for Big Data solutions (clustered/distributed computing, storage, data center networking)
      • Deployment of large distributed Big Data applications
  • Experience with established and emerging data technologies and concepts
  • Experience in leading data integration and change management
  • Ability to express complex technical concepts effectively, both verbally and in writing.
Technical Skills Required
  • Hortonworks Hadoop: HDFS, MapReduce, Hive, HBase, Pig, Mahout, Avro, Oozie
  • Stream Processing: Spark, Storm, Samza
  • NoSQL: Cassandra, HBase, or equivalent
  • ETL tools: Talend, Informatica, SSIS
  • Cluster security frameworks: Kerberos, Knox, Ranger
  • BI Reporting & Visualization tools: Tableau, Information Builders, MicroStrategy, Power BI, SSRS
  • Languages: Java, Scala, Python, Linux/Unix shell scripting, or other languages
  • Cloud: AWS, IBM Softlayer, Microsoft Azure, Google Cloud
  • Any RDBMS/DWBI technologies 
  • Experience working in lean/agile environments; knowledge of Jira
  • Experience within the Credit Union/Banking industries
  • Experience with Flume/Kafka/Storm is a plus 
  • Experience with VMware vSphere 5.1/5.5 is a plus
  • Solid knowledge of SSIS/SSRS/T-SQL 
Mandatory People Skills:
  • Ability to collaborate with, lead, and mentor developers, other architects, stakeholders, and cross-functional teams on Data & Analytics solutions
  • Ability to effectively prioritize and produce high-quality work products with others under pressure and within deadlines
  • Self-motivated, self-directed, and attentive to detail