Candidates will work on Identity Access Management / Identity Fraud Gateway dashboarding and alarming used by the IT, Operations, Finance, and Fraud Prevention organizations. The solution draws on Splunk, Oracle, Kibana, and other data sources, consolidates logging detail counters in Elasticsearch and InfluxDB, and exposes this data on real-time Grafana dashboards. In addition, there will be work in support of monitoring and analyzing dashboard trends, enabling real-time alarming, and coordinating with application teams on logging standardization.
Candidates will be required to design and develop Logstash and other jobs that consolidate application log counters using a standard methodology and output them to Elasticsearch and InfluxDB; build processes to migrate historical data as well as manipulate real-time data; and create and organize Grafana dashboards providing business transaction details, operational and server health details, and customer funnels aggregated across channels.
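The counter-consolidation work described above can be sketched in a few lines. This is a minimal illustration only: the log line format, event names, and timestamps below are assumptions, not the team's actual logging standard, and in practice the resulting counters would be written to Elasticsearch or InfluxDB by a Logstash pipeline or a client library rather than printed.

```python
import re
from collections import Counter

# Assumed log format: "YYYY-MM-DD HH:MM:SS EVENT_NAME" (illustrative only).
LINE = re.compile(r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}):\d{2} (?P<event>\w+)")

def consolidate(lines):
    """Roll raw application log lines up into per-minute event counters.

    Returns {(minute, event): count}, a shape that maps naturally onto
    an Elasticsearch document or an InfluxDB point per (minute, event).
    """
    counts = Counter()
    for line in lines:
        m = LINE.match(line)
        if m:  # skip lines that do not follow the assumed standard
            counts[(m.group("ts"), m.group("event"))] += 1
    return counts

# Hypothetical application log lines.
logs = [
    "2024-05-01 10:00:01 LOGIN_OK",
    "2024-05-01 10:00:42 LOGIN_OK",
    "2024-05-01 10:00:55 LOGIN_FAIL",
    "2024-05-01 10:01:03 LOGIN_OK",
]
print(consolidate(logs))
```

Keying every counter on a standardized (timestamp bucket, event) pair is what lets dashboards aggregate the same metric across applications and channels.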
Candidates must have experience in data analytics, be able to solve complex problems, and be capable of mining data from primary and secondary sources, then reorganizing that data into a format that can be easily interpreted and transformed into reporting structures.
Must Have Skills:
Experience in Data analytics
Experience with Big Data technologies
Experience in the Python and R programming languages
Experience building real-time and batch dashboards
Experience with distributed data platforms (HDFS, Elasticsearch, Splunk, Cassandra)
Experience working with NoSQL and in-memory databases
Experience with Kafka, including cross-DC replication
Experience with Docker and Kubernetes
Experience with in-memory caches like Redis / Akka Distributed data
Experience with CI/CD processes - Git (Bitbucket), Jenkins, Jira, Confluence
Google Drive suite services (Google Docs, Google Sheets, etc.)
Experience working with Docker images
Experience in application and web service development in Java / Scala
Knowledge of application secure-coding standards, including OWASP best practices
Experience with LDAP (Lightweight Directory Access Protocol) with Ping or equivalent
Awareness of the functional programming paradigm
Experience working with pub/sub messaging systems such as Apache Kafka or Amazon Kinesis
Experience in unit testing, load testing, and automation tools such as JUnit, NUnit, Gatling, Selenium, Watir, and Sitespeed
Responsibilities:
Understanding business objectives and developing models that help achieve them, along with metrics to track their progress
Performing fraud data analytics, data pipelining, point-anomaly model creation, and real-time dashboard creation
Mining data from primary and secondary sources, then reorganizing that data into a format that can be easily read by either humans or machines
Preparing reports for executive leadership that effectively communicate trends, patterns, and predictions using relevant data
Analyzing data and implementing end-to-end solutions to address fraud
Providing key insights on a regular basis by analyzing fraud logs and logs from other channels
BS in computer science or a related field