Job Summary

Our client, one of the Big Four professional services networks in the world, is looking for a Reporting COE Engineer to work in its Data Labs. A career in their Data Labs group will provide you with the opportunity to design, build and optimize enterprise-scale capabilities that enable the firm to reimagine business processes, expand its business portfolio and equip engagement teams with technology. You will work closely with business owners to create unique client solutions that enable automation, leverage machine learning, improve data analytics and provide visual reporting. You will leverage a large array of open-source and commercial technologies to extend the firm's software platform, delivering tools in a rapid, agile and quality-driven approach. You'll focus on software technology innovation, continuous integration/continuous delivery processes, software product planning and management, and cloud strategy.

Responsibilities

As a Senior Reporting COE Engineer, you'll lead development tasks, solving problems and helping to build complex technical solutions from design through implementation. Professional skills and responsibilities for this engineering level include but are not limited to:

  • Work independently in data visualization and software development to contribute towards complex reporting solutions across a range of applications and use cases.
  • Lead software tool and library assessments to determine applicability as well as strengths and weaknesses.
  • Perform unit tests on developed code in a local developer environment.
  • Participate in the agile scrum process and follow software development lifecycle practices.
  • Participate in code reviews to ensure quality of team contributions.
  • Analyze complex ideas or proposals and build a range of meaningful recommendations.
  • Develop a perspective on key global trends in technology, and how they impact the firm and our clients.
  • Uphold the firm’s code of ethics and business conduct.

Qualifications

  • Minimum Degree Required: Bachelor's Degree
  • Required Fields of Study: Management Information Systems, Computer and Information Science, Systems Engineering, Electrical Engineering, Chemical Engineering, Industrial Engineering, Mathematics, Statistics, Mathematical Statistics.
  • Minimum Years of Experience: 4 year(s) of related experience.
  • Demonstrates thorough knowledge and/or a proven record of success in the following areas:
    • Javascript (NodeJS) and experience with data extraction, data cleansing, and data wrangling;
    • SQL and experience with relational databases;
    • Coding of business rules (analytics) in one of the programming languages listed above;
    • Experience working with business teams to capture and define data models and data flows to enable downstream analytics;
    • Data modeling, data mapping, data governance and the processes and technologies commonly used in this space;
    • Data integration tools (e.g. Talend, SnapLogic, Informatica) and data warehousing/data lake tools;
    • Business Intelligence tools such as Tableau, PowerBI, Zoomdata, Pentaho, etc.;
    • Javascript-based charting libraries such as D3.js, ECharts, Chart.js, HighCharts, etc.;
    • Systems development life cycles such as Agile and Scrum methodologies; and,
    • API-based data acquisition and management.
  • Demonstrates thorough abilities and/or a proven record of success in the following areas:
    • Object-oriented/object function scripting languages such as Java, NodeJS, Scala, Python, R, C/C++, etc.;
    • Relational SQL, distributed SQL and NoSQL databases including, but not limited to, MSSQL, PostgreSQL, MySQL, MemSQL, CrateDB, MongoDB, Cassandra, Neo4j, AllegroGraph, ArangoDB, etc.;
    • Big data tools such as Hadoop, Spark, Kafka, etc.;
    • Data modeling tools such as ERWin, Enterprise Architect, Visio, etc.;
    • Data integration tools such as Alteryx, Talend, Informatica, SnapLogic, etc.;
    • Data pipeline and workflow management tools such as Azkaban, Luigi, Airflow, etc.;
    • Cloud technologies such as SaaS, IaaS and PaaS within Azure, AWS or Google and the associated data pipeline tools;
    • Linux and proven comfort level with bash scripting; and,
    • Docker, Puppet and agile development processes.
  • Demonstrates thorough abilities and/or a proven record of success in the following areas:
    • Building enterprise data pipelines and the ability to craft code in SQL, Python, and/or R;
    • Building batch data pipelines with relational and columnar database engines as well as Hadoop or Spark, and understanding their respective strengths and weaknesses;
    • Building scalable and performant data models;
    • Applying computer science fundamentals such as data structures, algorithms, programming languages, distributed systems, and information retrieval;
    • Presenting technical and non-technical information to various audiences;
    • Transforming and analyzing large data sets and deriving insights from data using various BI and data analytics tools;
    • Thinking differently to solve complex business problems;
    • Securely handling data both in motion and at rest, including communication protocols, encryption, authentication, and authorization;
    • Working with graph databases and graph modeling; and,
    • Working with the requirements of data science teams.