Short description of the project:

Refresh and roll-out of an end-to-end Data Quality process, including rules, controls and system upgrades, for the customer in a multi-department, multi-technology and multi-system environment.

Must haves

Exposure to Data Projects (data warehouse or data mart implementation): 5 years (minimum 1 project of 1 year duration)
Exposure to regulation implementation: 3 years (minimum 1 regulation in banking, insurance or DQ)

Exposure to Lean or Six-Sigma (KPI…): 3 years
Exposure to Process or Capability Framework design or implementation: 3 years (minimum 1 project)
Exposure to Large Organization: 5 years (familiar with multi-systems, multi-teams, large OBS…)
Exposure to Business Rules and Data Control Rules design and implementation: 5 to 10 years (a minimal T-SQL sketch follows this list)
Exposure to Data Profiling Platforms: full mastery of 1 profiling technology (SQL, SAS…) end-to-end
Exposure to Data Catalogues: 3 years + minimum 1 catalogue (market or home-made)
Exposure to Rules set-up: 5 years
Exposure to a platform-based Data Quality implementation: minimum 1 platform (SAS DataFlux, MS DQS…)
Exposure to Databases: 3 years (preference SQL Server or DB2; optional Oracle or others)
Exposure to SQL: 3 years (direct querying via TOAD or SQL Server Management Studio)
Exposure to scripting: minimum 1 scripting language (PowerShell, SAS, Linux shell scripting…)
Exposure to Regular Expressions: minimum 1 (such as RegEx…)
Exposure to ETL: 3 years
Exposure to Data Quality Principles: minimum 15 principles (able to list, explain, give examples of, and remediate via one or multiple technologies)
Exposure to Metadata: 5 years (either pure, or 5 combined years in which metadata played a key role)
Exposure to Market Data Models: 5 years (minimum full mastery of 1 data model type; ideally knowing 3)
Exposure to Standardization: 3 years
Exposure to Naming Conventions, Formatting and Conforming: 3 years
Exposure to Data Mapping: 3 years (minimum having developed 1 large mapping or 5 medium mapplets, where at least 1 mapplet is dynamic rather than Excel-based)
Exposure to Data Culture: at least 5 concepts (In-Format, Out-Format, Collation, Encoding, Data Protocol, Bridge Table…)
Exposure to Master Data/Data Remediation: 3 years (full mastery of Scan/Probe, Load/ETL, Match, Map, Merge, Deduplicate, Gold, confidence level definition, exception list, black list, tolerance; see the deduplication sketch after this list)
Exposure to Workflow Management or complex scheduling/activity planning: 3 years with minimum 1 market platform
Exposure to SAS Visual Analytics, SAS DI and SAS EG: 3 years
Exposure to SQL Server Data Quality Services, T-SQL scripting and stored procedures: 3 years
Exposure to the Software Development Lifecycle: 5 to 10 years (preference for iterative development or DevOps, pipelining)
Exposure to Source Control: Git, Atlassian, SVN…
Exposure to Scorecards and Metrics: 5 years
Exposure to DAMA-DMBOK: 3 years
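
As a concrete illustration of the control-rule and T-SQL items above, here is a minimal sketch of a completeness and validity control. It is an assumption-laden example: the dbo.Policy table, the PolicyNumber column and the POL-NNNNNN format are invented for illustration, and T-SQL LIKE patterns stand in for the regular expressions a platform such as DataFlux or DQS would normally hold.

    -- Hypothetical completeness and validity control; each branch returns
    -- one row (control name, row counts, score) that can feed a scorecard.
    SELECT
        'Completeness' AS ControlName,
        COUNT(*) AS TotalRows,
        SUM(CASE WHEN PolicyNumber IS NULL THEN 1 ELSE 0 END) AS FailedRows,
        CAST(1.0 - 1.0 * SUM(CASE WHEN PolicyNumber IS NULL THEN 1 ELSE 0 END)
                       / NULLIF(COUNT(*), 0) AS DECIMAL(5, 4)) AS Score
    FROM dbo.Policy
    UNION ALL
    SELECT
        'Validity',
        COUNT(*),
        SUM(CASE WHEN PolicyNumber NOT LIKE 'POL-[0-9][0-9][0-9][0-9][0-9][0-9]'
                 THEN 1 ELSE 0 END),
        CAST(1.0 - 1.0 * SUM(CASE WHEN PolicyNumber NOT LIKE 'POL-[0-9][0-9][0-9][0-9][0-9][0-9]'
                             THEN 1 ELSE 0 END)
                       / NULLIF(COUNT(*), 0) AS DECIMAL(5, 4))
    FROM dbo.Policy
    WHERE PolicyNumber IS NOT NULL;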
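
In the same spirit, a minimal sketch of the Match/Deduplicate/Gold steps named in the Master Data/Data Remediation item; the dbo.Customer table, the Email match key and the LastUpdated survivorship rule are again hypothetical.

    -- Keep one gold record per match key; delete the rest.
    WITH Ranked AS (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY UPPER(LTRIM(RTRIM(Email)))   -- match key
                   ORDER BY LastUpdated DESC                 -- survivorship: keep the newest
               ) AS rn
        FROM dbo.Customer
    )
    DELETE FROM Ranked
    WHERE rn > 1;   -- non-gold duplicates are removed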

Nice to have

Certified in SAS (Developer or Administrator): a strong plus
Certified in SQL (Developer or Administrator): a strong plus
Certified in Quality (ISO certification…): a good plus
Certified in Market Regulation (any…): a good plus

Tasks

The candidate shall be able to explain, design and implement a data quality trajectory within the company. As with any complex process, the candidate shall walk the talk and be able to:

  • Analyse our existing DQ process, Framework and Tooling Architecture
  • Identify the weaknesses and strengths, discuss his or her proposed improvements with the DMO, the Enterprise Data Architect and the CDO
  • Base his or her improvements on the identified pain points (both functional and technical) as well as on the requirements of the Business and IT sponsors and workers
  • Put in place an activity plan (either personal or a team activity plan)
  • Build or enrich the company data catalogue with his or her findings and consume the data catalogue to trace a data quality journey
  • Define a roadmap next to the DQ activity plan where we can clearly see the difference between the functional activities, the technical system activities (implementation) and the communication activities
  • Build a data controls base list for data criteria (such as readiness, availability, completeness…) that will serve as a common ground for general data node qualification
  • Identify, sort and triage the data quality controls and review their regular expressions
  • Identify which DQ controls shall go into the ETL versus the ones that shall go into profiling
  • Draw up a plan for ETL vs profiling alignment in order to reach a common KPI per data node
  • Openly raise issues and be proactive in looking for options without being rigid towards a specific approach
  • Put in place, jointly with the remaining data stewards and IT workers, the iterative cycle of data node review, including tagging, consolidating, profiling, measuring, analysing, communicating and rectifying, with the cycle then restarting in the next iteration
  • Analyse the company's Data Quality tools (SAS DataFlux and SQL Server Data Quality Services) and make the necessary upgrades (custom development, writing regular expressions, doing triage, introducing scheduling of activities via ETL schedulers or via Streamworks)
  • Build joint scorecards with agreed-upon KPIs with the DMO, Enterprise Data Architect, CDO, and Business and IT sponsors (a minimal per-data-node KPI sketch follows this list)
  • Work with a large vision but in small/short iterations
  • Reuse market best practices from regulations, ISO standards and DAMA methodology
  • Document and industrialize the activities
  • Coach Data stewards and business workers in applying remediations
  • Build and manage his or her own bridge lists and keep version control up to date
  • Integrate into the company culture and be open-minded during the collaboration
  • Be comfortable with the start-up phase and propose a written common approach in order to install cohesion around Data Quality
  • Sell data quality as an enabler of data as an asset and show tangible motivation for data
  • Introduce new ideas gradually to gain buy-in and preserve stability
  • Avoid a rigid mentality and criticism of people, tools or processes; instead bring realistic proposals to the table
  • Be structured and organized
  • Leverage his or her own computer science background, data experience and programming skills
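
As a minimal sketch of the scorecard and common-KPI tasks above, the query below rolls control results up into one pass rate per data node, so ETL-side and profiling-side checks land on the same figure. The dbo.DQ_ControlResult log table and its columns are hypothetical, invented to match the control sketches given earlier.

    -- Hypothetical log table that individual controls would populate.
    CREATE TABLE dbo.DQ_ControlResult (
        DataNode    NVARCHAR(100) NOT NULL,  -- e.g. 'Party', 'Policy', 'Claim'
        ControlName NVARCHAR(100) NOT NULL,  -- e.g. 'Completeness', 'Validity'
        RunDate     DATE          NOT NULL,
        TotalRows   INT           NOT NULL,
        FailedRows  INT           NOT NULL
    );

    -- One KPI per data node and run date: the weighted pass rate across
    -- all controls registered for that node.
    SELECT
        DataNode,
        RunDate,
        CAST(1.0 - 1.0 * SUM(FailedRows) / NULLIF(SUM(TotalRows), 0)
             AS DECIMAL(5, 4)) AS PassRateKPI
    FROM dbo.DQ_ControlResult
    GROUP BY DataNode, RunDate
    ORDER BY DataNode, RunDate;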

Skills / Profile

  • Bachelor or Master in Computer Science, Operational Research, Data Science or equivalent
  • 5 years of effective experience and exposure to Data Quality (extensive experience with 100% focus)
  • OR 10 years of general experience mixing Data, Data Quality, software engineering and process engineering with other non-data-related projects (such as Six Sigma, process optimization, modelling…)
  • Past experience in leading a minimum of 3 resources and extensively collaborating with multiple profiles: IT, Business, seniors and mediors

Knowledge of insurance: Yes

Exposure to data projects and/or data quality missions in a highly regulated industry such as insurance (minimum one of Life, Non-Life or Corporate) or banking, or any other highly regulated environment where data quality is directly linked to profit and loss, government penalties, client retention or a strategy sensitive to new-market penetration.

Language skills: Dutch / English

Job ID: 3315
