Data Systems Engineer Lead - Seattle, WA at Geebo

Data Systems Engineer Lead

Responsibilities and essential job functions include but are not limited to the following:
1. Demonstrate deep knowledge of data and the ability to lead others in the data engineering team to build and support non-interactive (batch, distributed) and real-time, highly available data, data pipeline, and technology capabilities.
2. Demonstrate focus on working toward defined business objectives and an understanding of the business value of work performed.
3. Demonstrate deep understanding of the ETL process (and variants thereof), including orchestration and development of data products.
4. Translate strategic requirements into business requirements to ensure solutions meet business needs.
5. Work with infrastructure provisioning and configuration tools to develop scripts that automate deployment of physical and virtual environments, and to develop tools that monitor usage of virtual resources.
6. Assist in the definition of architecture that ensures solutions are built within a consistent framework.
7. Lead resolution activities for complex data issues.
8. Define and implement data retention policies and procedures.
9. Define and implement data governance policies and procedures.
10. Identify improvements to team coding standards and help implement them.
11. Leverage subject matter expertise to coordinate issue resolution efforts across peer support groups, technical support teams, and vendors.
12. Develop and maintain documentation relating to all assigned systems and projects.
13. Perform systems and applications performance characterization and trade-off studies through analysis and simulation.
14. Perform root cause analysis to identify permanent resolutions to software or business process issues.
15. Lead by example by demonstrating the Starbucks mission and values.
In addition:
1. Ability to apply knowledge of multidisciplinary business principles and practices to achieve successful outcomes in cross-functional projects and activities.
2. Effective communication skills.
3. Excellence at problem solving.
4. Strong working knowledge of Python, Java, Scala, or C#.
5. Strong working knowledge of SQL.
6. Strong working knowledge of SQL and NoSQL platforms.
7. Proficiency in debugging, troubleshooting, performance tuning, and the relevant tooling.
8. Strong working knowledge of Hadoop, YARN, MapReduce, Pig or Hive, and Spark.
9. Demonstrated ability to productionize at least two big data implementations.
10. Experience using a public cloud (AWS or Azure preferred) for data applications.
11. Proficiency in shell scripting.
12. Solid understanding of data design patterns and best practices.
13. Proficiency in CI/CD tools.
14. Proficiency in logging and monitoring tools, patterns, and implementations.
15. Understanding of enterprise security, REST/SOAP services, and best practices around enterprise deployments.
16. Proven ability and desire to mentor others in a team environment.
17. Working knowledge of data visualization tools such as Tableau is a plus.
18. Practice, evangelize, and be an ambassador for agile and DevOps culture.
19. Proven ability and desire to lead others in a team environment.

Experis is an Equal Opportunity Employer (EOE/AA).
Estimated Salary: $20 to $28 per hour based on qualifications.
