Why is our BI Data Development team the next step for you?
Over the coming year there will be a significant data transformation project (using big data technologies) within Worldpay. This enabling project will require a large step-change in the way BI and Analytics data is sourced, maintained and distributed. The primary purpose of this team is to assist in developing the data structures (both physical and logical) to facilitate this change and to take advantage of new data capabilities.
This team mainly works on creating the physical and logical data layers that are the key consolidated data sources for analysis across the business. They create and maintain the transformation and data-loading steps that feed these layers. During the transition to Big Data platforms, members will also support and maintain the existing data processes, re-writing and improving them where needed to deliver a faster, more accurate base for analysis and reporting.
How will you add value on a day-to-day basis?
You will be utilising emerging technologies and capabilities (specifically within the Hadoop sphere of services), creating a suite of physical or virtual data structures as required by the reporting and analytics teams. You would be working specifically with Hive and Ambari Views to create a Hadoop-driven data warehouse as a middle layer between an HDFS 'data lake' and front-end tools (e.g. Tableau, R, Python, Excel).
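As an illustrative sketch only (table names, columns and HDFS paths here are hypothetical, not Worldpay's actual schema), the middle-layer pattern described above typically maps raw data-lake files into a Hive external table, then builds a curated warehouse table on top for the front-end tools to query:

```sql
-- Raw layer: a Hive external table mapped directly onto data-lake files.
-- Dropping the table does not delete the underlying HDFS data.
CREATE EXTERNAL TABLE IF NOT EXISTS raw_payments (
  transaction_id STRING,
  merchant_id    STRING,
  amount         DECIMAL(12,2),
  txn_ts         TIMESTAMP
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/lake/payments/';

-- Warehouse layer: a cleaned, columnar (ORC) table that BI tools
-- such as Tableau query instead of touching the lake directly.
CREATE TABLE IF NOT EXISTS dw_payments
STORED AS ORC
AS
SELECT transaction_id,
       merchant_id,
       amount,
       to_date(txn_ts) AS txn_date
FROM raw_payments
WHERE amount IS NOT NULL;
```

The design choice is the one the role describes: the external table keeps the lake as the single source of raw truth, while the ORC table provides the fast, consolidated layer for reporting.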
You would also consolidate a wide range of source information from complementary systems and platforms into a single unified data environment, and operationalise and automate a wide array of manual and semi-manual data tasks, thereby developing best practices and ways of working and disseminating these amongst the wider team.
You will be the point of reference for technical excellence within the wider user community to facilitate the team’s understanding of the data environment, processes and procedures within Worldpay BI Development.
What will make you the ideal candidate?
A proven track record of applying data processes to solve difficult business problems. You must have worked with SQL Server in both a production and a Data Warehousing capacity, along with logical and physical data layer construction and architecture. Experience with Kimball methodologies is important. You will have the ability to work 'backwards' through a data supply chain to reverse-engineer data pedigree and isolate flawed manipulations, and you will have Big Data (Hadoop) experience.
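For context on the Kimball approach mentioned above, a minimal star schema (all table and column names here are hypothetical illustrations, not part of the role) keys a fact table to conformed dimension tables via surrogate keys:

```sql
-- Hypothetical Kimball-style star schema: dimensions carry descriptive
-- attributes; the fact table holds measures keyed by surrogate keys.
CREATE TABLE dim_merchant (
  merchant_key  INT PRIMARY KEY,   -- surrogate key
  merchant_id   VARCHAR(20),       -- natural/business key from source
  merchant_name VARCHAR(100),
  country       VARCHAR(50)
);

CREATE TABLE dim_date (
  date_key  INT PRIMARY KEY,       -- e.g. 20180131
  full_date DATE,
  year      INT,
  month     INT
);

CREATE TABLE fact_payment (
  merchant_key INT REFERENCES dim_merchant(merchant_key),
  date_key     INT REFERENCES dim_date(date_key),
  amount       DECIMAL(12,2),      -- additive measure
  txn_count    INT
);
```

Surrogate keys decouple the warehouse from source-system identifiers, which is also what makes the 'backwards' data-pedigree tracing described above tractable.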
You will also have technical experience which includes, but is not limited to, SQL Server, Oracle and other RDBMSs, and data layer architecture and design. You will also possess experience of documentation and diagrammatic representation tools (Visio and similar). You must have worked on in-memory analytics (one of: Cognos, SSAS, cubes, etc.) and SSIS, batch files or other third-party ETL tools (e.g. ODI). Having worked on Hadoop, Pig, Sqoop, HBase, Hive and Ambari Views is highly desirable. Your experience may also include Tableau or other data visualisation skills.
How is Worldpay changing the world?
We are leaders in modern money. Each and every time you use your debit card or credit card to pay for something, whether online or face-to-face, there's a good chance it happened because of us. On an annual basis our innovations, systems and technology enable billions of money transactions globally. Working with customers large and small, we help them to take your payments quickly, safely and reliably, allowing them to grow their businesses and making your life more convenient in the process. As a leader in global fintech and the largest London IPO since 2011, this is a great time to join us.
London / England
Tech and Engineering
January 31, 2018