#Hiring #Onsite #DataArchitect #Commercial_Real_Estate_Insurance

Job Title: Data Architect - Commercial Real Estate Insurance (commercial real estate insurance experience is a must)
Job Locations: US-TX-Dallas | US-IL-Chicago (Onsite)
Duration: 6 months
Share resume to Jayant@tek-staffing.com

Overview
We are looking for a seasoned #Data #Architect specialized in commercial real estate insurance. This critical role focuses on the creation and stewardship of a sophisticated data platform, with deep data and dimensional modeling expertise to underpin advanced risk management and insurance procurement strategies.

Key Responsibilities
- Advanced #Data #Modeling: Develop and refine complex data models that accurately represent the nuances of insurance risk management. Employ dimensional modeling techniques to enable scalable, performance-optimized analytics (see the sketch after this post).
- Strategic #Architecture Development: Design a data architecture that supports immediate needs and adapts to future challenges in insurance risk assessment and mitigation.
- Collaboration for Innovation: Engage with risk management and insurance experts to understand their data analysis needs, providing tailored data modeling solutions that enhance decision-making.
- #Governance and Data #Quality: Implement rigorous data governance frameworks to ensure the accuracy, completeness, and security of insurance and risk management data. Uphold high standards of data quality across all data assets.
- Platform Enhancement: Continuously assess and evolve the data architecture to incorporate emerging technologies and methodologies, keeping the platform at the forefront of industry standards.
- Effective Communication: Serve as the key intermediary between IT, risk management, and insurance teams, clearly communicating data strategies and their business impacts.

Desired Qualifications
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- Professional Experience: Minimum of 7 years in a data architect role, with a significant focus on data and #dimensional #modeling. Experience in the insurance or risk management sector is highly preferred.
- Technical Skills: Demonstrated expertise in data and dimensional modeling, database management systems (SQL, NoSQL), ETL frameworks, and big data technologies. Knowledge of cloud data services (AWS, Azure, Google Cloud) is advantageous.
- Industry Knowledge: Deep understanding of insurance industry dynamics, risk assessment practices, and the specific data challenges of the sector.
- #Analytical and Problem-Solving: Exceptional ability to analyze complex data issues and develop innovative modeling solutions.
- Leadership: Proven track record of managing projects, guiding teams, and promoting a culture of collaboration and innovation.

#c2c #jobchange #lookingforjob #Benchsales
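For readers who want to picture what the dimensional modeling responsibility above looks like in practice, here is a minimal sketch of a star schema for commercial property placements: one fact table joined to conformed dimensions. Every table and column name is a hypothetical assumption for illustration, not the client's actual model.

```python
import sqlite3

# A toy star schema for commercial real estate insurance analytics.
# All table and column names are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_property (
    property_key INTEGER PRIMARY KEY,
    address TEXT,
    occupancy_type TEXT,        -- e.g. office, retail, warehouse
    construction_class TEXT
);
CREATE TABLE dim_policy (
    policy_key INTEGER PRIMARY KEY,
    carrier TEXT,
    coverage_type TEXT,         -- e.g. property, general liability
    effective_date TEXT
);
CREATE TABLE fact_placement (   -- grain: one coverage placement per property
    property_key INTEGER REFERENCES dim_property(property_key),
    policy_key   INTEGER REFERENCES dim_policy(policy_key),
    total_insured_value REAL,
    annual_premium REAL,
    deductible REAL
);
""")
conn.execute("INSERT INTO dim_property VALUES (1, '100 Main St, Dallas TX', 'office', 'fire-resistive')")
conn.execute("INSERT INTO dim_policy VALUES (1, 'Acme Insurance', 'property', '2024-01-01')")
conn.execute("INSERT INTO fact_placement VALUES (1, 1, 25000000, 180000, 100000)")

# Rate per $100 of insured value by occupancy type: the kind of slice
# a dimensional model makes cheap to compute.
for row in conn.execute("""
    SELECT p.occupancy_type,
           SUM(f.annual_premium) / SUM(f.total_insured_value) * 100 AS rate_per_100
    FROM fact_placement f JOIN dim_property p USING (property_key)
    GROUP BY p.occupancy_type
"""):
    print(row)
```

The design choice that matters: facts hold additive measures at a declared grain, while descriptive attributes live in dimensions, so new risk analytics mostly mean new queries rather than new schema.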
-
BT Hiring Alert!!

Quality Control Specialist
Job Req ID: 36857
Posting Date: 2 Sep 2024
Function: Data & AI
Unit: Digital
Location: RMZ Ecoworld, Devarabeesanahal, Bengaluru, India
Salary: Competitive
Closing Date: 02-Aug-2024

What you'll be doing
1. Drives the data governance strategy roadmap and enables its implementation.
2. Executes processes to monitor data governance policies and procedures to ensure compliance with regulatory requirements.
3. Leads the documenting and implementing of data governance policies, standards, and procedures in existing and new data environments.
4. Acts as a technical specialist on security best practices and recommends changes to enhance security and reduce risks.
5. Leads collaboration efforts with key stakeholders to ensure acceptance and sponsorship for data governance policy.
6. Facilitates the development, implementation, and business adoption of data quality standards, data protection standards, and adoption requirements across the BT group.
7. Contributes to a culture of data across the BT group and leads on the implementation of various data projects.
8. Mentors other data governance professionals, helping to improve the team's abilities by acting as a technical resource.
9. Champions, continuously develops, and shares with the team knowledge of emerging trends and changes in data governance.

The skills you'll need to succeed
• Experience of using and applying data management best practice, including maturity models such as DCAM/CDMC.
• Experience of working on complex multi-year, multi-stream data programmes.
• Experience in the delivery of data management/data governance capabilities, such as metadata management, data quality management, and data ownership.
• Excellent stakeholder management skills.
• Degree-level education or relevant experience as defined.
• Experience of delivery using Agile methodology.

Experience you'd be expected to have
• Experience in the following technologies:
  o Google Cloud Platform
  o AWS
  o Dataplex
  o Collibra
  o JIRA

Our leadership standards
Looking in:
- Leading inclusively and safely: I inspire and build trust through self-awareness, honesty and integrity.
- Owning outcomes: I take the right decisions that benefit the broader organisation.
Looking out:
- Delivering for the customer: I execute brilliantly on clear priorities that add value to our customers and the wider business.
- Commercially savvy: I demonstrate strong commercial focus, bringing an external perspective to decision-making.
Looking to the future:
- Growth mindset: I experiment and identify opportunities for growth for both myself and the organisation.
- Building for the future: I build diverse future-ready teams where all individuals can be at their best.

A FEW POINTS TO NOTE:
Although these roles are listed as full-time, if you're a job share partnership, work reduced hours, or any other way of working flexibly, please still get in touch.

If you are eligible and interested, DM me and share your updated resume.
-
Data Engineer @ Lyft, via ai-jobs.net ([Global] Oracle Advanced Analytics)
URL: https://ift.tt/NWciOz5

At Lyft, our mission is to improve people's lives with the world's best transportation. To do this, we start with our own community by creating an open, inclusive, and diverse organization. Lyft thrives on community—it's the essence of who we are and what we do. Our commitment to fostering an open, inclusive, and diverse environment is paramount, ensuring every team member is valued for their unique contributions.

At Lyft, data isn't just part of our decision-making process; it's the foundation. It drives our ability to deliver exceptional transportation experiences and offers insights into the impact of our product launches and features. Joining Lyft as a Data Engineer means becoming a pivotal part of a team dedicated to shaping the future of transportation. You'll be tasked with developing robust data infrastructure—encompassing data transport, collection, and storage—and providing services that enable our leadership to make informed, risk-reducing decisions.

We're in search of a Data Engineer to construct a scalable insurance data pipeline that will inform our financial strategies. Collaborating closely with our claims engineering, actuarial, and science teams, as well as our insurance partners, you'll lead the charge from technical proposal to final implementation. This role involves clear communication of technical challenges to both our internal teams and external partners, ensuring a cohesive approach to problem-solving.

As the steward of our core data pipeline, which underpins Lyft's key metrics, you'll leverage your data expertise to refine data models and spearhead the creation and launch of scalable data pipelines. These efforts will support Lyft's expanding needs in insurance data processing and analytics, providing critical business and user behavior insights. With access to vast amounts of Lyft data, your work will empower teams across Analytics, Data Science, Engineering, and beyond, driving innovation and growth.

Responsibilities:
- Design and implement insurance and claim-related data pipelines to help optimize insurance operation strategies
- Tune ETL and MapReduce jobs to improve data processing performance
- Partner with external insurance partners and Lyft business stakeholders to ensure seamless execution within established timelines
- Own the core company data pipeline, converting business and engineering needs into efficient, reliable data pipelines
- Act as a main contributor to the team roadmap and lead the team to make the right technical decisions

Experience:
- 2+ years of relevant professional experience
- Strong experience with Spark
- Experience with the Hadoop (or similar) ecosystem: S3, DynamoDB, MapReduce, YARN, HDFS, Hive, Spark, Presto, Pig, HBase, Parquet
- Strong skills in a scripting language (Python, Ruby, Bash)
- Good understanding of SQL engines and able to conduct advanced perform...
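For candidates wondering what "a scalable insurance data pipeline" might look like at its smallest, here is a hedged PySpark sketch: deduplicate claim events, then aggregate incurred losses by accident month. The bucket paths and column names (claim_id, event_ts, accident_date, coverage, incurred_amount) are hypothetical assumptions, not Lyft's actual schema or stack.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("insurance-claims-sketch").getOrCreate()

# Hypothetical input: raw claim events landed as Parquet.
claims = spark.read.parquet("s3://example-bucket/raw/claims/")

# Keep only the latest event per claim_id.
w = Window.partitionBy("claim_id").orderBy(F.col("event_ts").desc())
latest = (claims
          .withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)
          .drop("rn"))

# Roll incurred losses up to accident month and coverage for downstream
# actuarial and finance consumers.
monthly = (latest
           .withColumn("accident_month", F.date_format("accident_date", "yyyy-MM"))
           .groupBy("accident_month", "coverage")
           .agg(F.sum("incurred_amount").alias("incurred"),
                F.count("*").alias("claim_count")))

# Partitioned output keeps per-month reads cheap.
(monthly.write.mode("overwrite")
        .partitionBy("accident_month")
        .parquet("s3://example-bucket/marts/claims_monthly/"))
```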
-
Data & Analytics Series - Data Is A People Issue (1)

It is probably not controversial to say that data and analytics is a technical issue. In the business world, however, it is ultimately a people issue. Why?

First, Strategy. Data strategy is an integral part of the enterprise strategy.

1. What would you like to use data & analytics to achieve? Whether it's to improve pricing sophistication, operational efficiency, product development, risk management, or customer satisfaction, without a clear and coherent enterprise strategy there cannot be an effective data strategy. A rule of thumb: survey a random sample of employees, and if more than 80% of the respondents can articulate the enterprise's strategy succinctly, congratulations!

2. By "integral", we mean that the enterprise strategy has to integrate data & analytics. What is the data impact of shifting strategic priorities? Imagine the following hypothetical scenario. A company has multiple legacy systems due to historical reasons (e.g. M&A, buy-vs-build decisions). Some of the data are in Oracle, some in SQL Server, some in DB2, and yet others may be in AWS or GCP, depending on the operating company (OC). Each OC has its own, potentially overlapping, product structures in different states. For example, a golf cart insurance policy may be a standalone policy, an endorsement on an auto policy, or even part of a homeowner's policy. This means that in a particular state, the same product may be sold by multiple OCs but with different prices and coverages, under different policies, coded with different values in different formats, and stored in different systems that may or may not talk to each other.

To generate a report on the performance of this product, the data team needs to bring in data from diverse sources, clean the errors associated with each source, standardize the formats and values, and make tradeoffs where reconciliation is not possible (the toy sketch after this post shows what this harmonization looks like in miniature). And they may not even have access to each system.

To add to the complexity, now the company decides to migrate one OC's data to another system, state by state, starting with new business. This means the OC's data will be simultaneously on both the old and new systems, depending on whether a policy is new or renewal and whether a particular state has migrated. And don't forget the "new" system also has data from its current OC that needs to be reconciled, with different values and definitions. Even if we get the data together, which definitions do we report on? Then the company decides to switch to yet another system…

In this fictional example, these are not technical issues but people issues: people make business decisions with tremendous impact on data. In today's world, data impact has to be a mandatory step in strategic planning.

(to be continued…)

#data #analytics #strategy #peopleissue
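To make the hypothetical concrete, here is a toy pandas sketch of the harmonization step described above. Every system layout, product code, and mapping here is invented for illustration; the point is how much translation two operating companies can require for a single product.

```python
import pandas as pd

# Two operating companies record the same golf cart product differently
# (all names, codes, and mappings below are illustrative assumptions).
oc_a = pd.DataFrame({"state": ["TX", "FL"],
                     "prod_cd": ["GC-STD", "GC-STD"],
                     "premium": [412.0, 388.0]})
oc_b = pd.DataFrame({"State": ["tx", "FL"],
                     "product": ["GOLFCART_ENDT", "GOLFCART_ENDT"],
                     "written_prem": ["415", "390"]})   # stored as text

# Map each source's layout onto one shared reporting definition.
canon_a = oc_a.rename(columns={"prod_cd": "product_code",
                               "premium": "written_premium"})
canon_b = oc_b.rename(columns={"State": "state",
                               "product": "product_code",
                               "written_prem": "written_premium"})

combined = pd.concat([canon_a.assign(source="OC_A"),
                      canon_b.assign(source="OC_B")])
combined["state"] = combined["state"].str.upper()
combined["written_premium"] = pd.to_numeric(combined["written_premium"])
combined["product_code"] = combined["product_code"].replace(
    {"GC-STD": "GOLF_CART", "GOLFCART_ENDT": "GOLF_CART"})

# One product view across operating companies, by state.
print(combined.groupby(["product_code", "state"])["written_premium"].sum())
```

Every rename, cast, and mapping above encodes a business decision someone made upstream, which is exactly the post's point.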
-
🔍 **Job Opportunity: Head of Delivery – Data Analytics - Coimbatore**

We are looking for a seasoned professional to spearhead our Data Analytics division, focusing on ensuring top-notch delivery performance. The Head of Data Analytics and Delivery will play a pivotal role in shaping and executing analytics strategies, driving actionable business insights, and ensuring prompt project deliveries. This position demands a unique blend of technical proficiency, strategic acumen, and robust leadership to oversee diverse teams, stakeholders, and client interactions.

🎯 **Key Responsibilities:**

1. **Strategic Leadership:**
- Craft and implement the overarching Data Analytics vision aligned with organizational objectives.
- Identify data-driven opportunities to tackle critical business challenges and drive value creation.

2. **Delivery Management:**
- Supervise end-to-end delivery of data analytics projects, emphasizing quality and timeliness.
- Establish and enforce project management processes, standards, and governance.
- Collaborate with cross-functional units to ensure smooth project execution.
- Efficiently handle budgets, timelines, and resources to meet delivery commitments.

3. **Stakeholder Management:**
- Serve as a primary contact for internal and external stakeholders, including senior management and clients.
- Translate intricate data insights into actionable recommendations for business leaders.

4. **Technical Expertise:**
- Offer guidance on data strategy, architecture, and infrastructure for scalability and efficiency.
- Supervise the design and deployment of dashboards, reports, and analytics solutions.
- Ensure compliance with data governance, security, and industry standards.

📚 **Required Skills and Qualifications:**

**Education:**
- Bachelor's degree in Computer Science, Data Science, Statistics, Business Administration, or a related field.
- MBA or equivalent advanced degree preferred.

**Experience:**
- 15+ years of experience, with a strong focus on data analytics within the insurance sector.
- Proven track record of successfully delivering large-scale analytics projects.

**Technical Skills:**
- Expertise in data analytics tools and technologies (e.g., SQL, Python, R, Tableau, Power BI).
- Knowledge of big data platforms like Hadoop, Spark, or Snowflake is a plus.
- Familiarity with machine learning and AI applications in insurance.

**Domain Knowledge:**
- Comprehensive understanding of insurance workflows, regulatory requirements, and industry challenges.
- Experience in leveraging analytics for key insurance use cases such as risk modelling, customer segmentation, and operational efficiency.

**Leadership and Communication:**
- Exceptional leadership skills with the ability to inspire and guide large teams.
- Strong client-facing and stakeholder management skills.
- Excellent communication and presentation skills.

Please send your CV to sathya_ks@outlook.com
-
*Data Management and Governance Pills - Data Quality*

Once the data is protected, the organization has the necessary conditions to take the next step in managing its data: ensuring that its data is of high quality. According to the DAMA DMBoK® v.2, this is a fundamental point in data management. The data must be reliable; otherwise, it is not possible to meet the needs of the business, either in day-to-day operations or in decision-making.

To have quality data, it is necessary to plan and implement data quality management techniques to measure, evaluate, and improve its condition so that it can be used reliably in the organization. To help with this task, the Data Quality Analyst can make use of (1) a set of tools for profiling, identification, data matching, data discovery, and more (a minimal profiling sketch follows this post), and (2) close work with the main users who consume the data, to extract their perception of the degree of quality with which the data is seen in the organization, as well as the characteristics that make data high quality. Regulatory items such as policies and standards also support these activities.

According to the DAMA DMBoK® v.2, the main benefits an organization can obtain from high-quality data are:
✨ Improved customer experience
✨ Increased productivity
✨ Risk reduction
✨ Ability to act on opportunities
✨ Increased revenue
✨ Competitive advantage gained from insights into customers, products, processes and opportunities

Data quality management is carried out constantly and at all stages of the data lifecycle (collection, storage, processing, etc.). It also requires planning, commitment, and a mindset of incorporating quality into processes and systems.

Data Quality supports Data Governance by having a direct impact on the effectiveness, reliability, and usefulness of information within an organization, generating accurate, reliable, consistent, complete, and current data that contributes to operational efficiency, compliance and security, decision-making, and building and maintaining the organization's reputation (internally and externally).

#datagovernance #dataquality #datamanagement #dataqualityreporting #datagovernanceanalyst #dataqualityanalyst #DAMA #DMBoK #datagovernanceprogram #regulators #strategicactive #data #education #collaboratingwithdatagovernance #dataarchitect #databankadministrator #dataengineer #datascientist #monitoring #quality #metrics #policies #standards #audit #organization #consistency #accuracy #reliability #efficiency #processes #systems #qualitygrade #datavictorycycle #reliabledata
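As a small illustration of the profiling tools mentioned above, here is a minimal pandas sketch that scores a few common quality dimensions (completeness, validity, uniqueness). The columns, reference values, and rules are illustrative assumptions, not checks mandated by the DMBoK.

```python
import pandas as pd

# A toy policy extract with deliberate defects: a duplicate key,
# a missing state, an invalid state, and a negative premium.
df = pd.DataFrame({
    "policy_id": ["P1", "P2", "P2", "P4"],
    "state": ["TX", "IL", None, "ZZ"],
    "premium": [500.0, -20.0, 300.0, 450.0],
})

valid_states = {"TX", "IL"}  # stand-in for a reference-data lookup

report = {
    "completeness.state": df["state"].notna().mean(),
    "validity.state": df["state"].isin(valid_states).mean(),
    "validity.premium_nonnegative": (df["premium"] >= 0).mean(),
    "uniqueness.policy_id": df["policy_id"].nunique() / len(df),
}
for metric, score in report.items():
    print(f"{metric}: {score:.0%}")
```

Scores like these only become data quality management once they are tracked against agreed thresholds and someone owns the follow-up.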
-
Data & Analytics Series - Data Is A People Issue (4)

(Recap: First, data strategy is an integral part of the enterprise strategy. Second, an effective data organization is a flexible matrix. Third, a robust data user community is foundational to an enterprise's success.)

Fourth, Accountability. Just as data security depends on people doing the right things more than on technology, the responsibility for building and maintaining a reliable, efficient, and user-friendly data environment lies with more than the IT and data departments.

1. Awareness. As mentioned in part one, business decisions have tremendous impact on data, which in turn affects subsequent decision making. To reduce inconsistency and fragmentation in the data environment, integration of systems, products, and operations needs to happen as soon as possible after an M&A, unless there are strategic, legal, or regulatory reasons not to do so. If integration cannot be done promptly, it calls into question the feasibility of the M&A decision in the first place. Of course, if the objective is to extract value for resale, that is another matter. Similarly, when setting up new legal entities, introducing new products, installing new systems, or upgrading existing ones, it is important to avoid adding non-value-added complexity to the data. The bottom line is that decision makers need to be acutely aware of the data impact of their decisions.

2. Standardization. Setting up and enforcing standards beforehand is a lot easier than conforming divergent elements afterwards. For example, I once saw more than 400 different spellings of "Farmers Insurance" in our underwriting data because the "prior carrier" field was free-format text. This could have been avoided had the business and IT partners used a drop-down menu instead, which is more a matter of choice than technology. (The sketch after this post shows the kind of downstream cleanup a free-format field forces.)

3. Process. My team's predictive model once stopped working mysteriously. It turned out that an upgrade to the production system had changed how a variable was captured, and our team was not notified or consulted beforehand. To keep all data stakeholders (e.g. business, data teams, IT, and others) on the same page, include a data impact assessment in the requirements for any project. Before installing new systems or upgrading existing ones, contact all data stakeholders to verify whether there would be interruptions. If there are, use a tiered approval system to evaluate the impact and remedies. The goal is to ensure data quality at every step of a business process.

In conclusion, if the objective of data and analytics is to serve people, then ultimately data is a people issue. We are all data users in one shade or another and should take care of data together. It takes a village.

This concludes the series "Data Is A People Issue". I hope you found the discussion of the non-technical aspects of data helpful. (The end)

#data #analytics #people #accountability #technology
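As a sketch of that downstream cleanup, here is a standard-library Python snippet that collapses free-text carrier names onto a controlled list. The canonical list, raw values, and the 0.6 cutoff are illustrative assumptions; real reference-data management would add human review.

```python
from difflib import get_close_matches

# Controlled vocabulary the drop-down menu would have enforced up front.
canonical = ["Farmers Insurance", "State Farm", "Allstate"]

# A few of the spellings a free-format "prior carrier" field invites.
raw_values = ["farmers ins.", "FARMERS INSURANCE CO",
              "Famers Insurance", "Allstate Corp"]

def standardize(value: str) -> str:
    """Fuzzy-match a raw entry to the controlled list, or flag it."""
    match = get_close_matches(value.title(), canonical, n=1, cutoff=0.6)
    return match[0] if match else "UNMAPPED: " + value

for raw in raw_values:
    print(f"{raw!r:>25} -> {standardize(raw)}")
```

The cheaper fix, as the post says, is upstream: constrain the input and this script never needs to exist.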
-
#Data #Analyst GSC's #HSBC
Closing date 🚨 17-May-2024

Protect: To protect our data, we need robust policies for data management and enhanced governance, e.g. data that is consumed from a trusted source, meets quality standards, and complies with regulatory obligations.

Unlock: Value is unlocked when we deliver on the opportunities our well-managed data presents, e.g. improving customer/colleague experience, increasing revenue, commercialising our data, optimising capital, enhancing risk management, or reducing costs.

As part of DAO, DAO #ESG's vision is for the #ESG Data Utility to be the single location for all ESG data, BI, and analytics delivery to meet the Group's ESG and Sustainability ambitions, which aim to help address climate change and wider #ESG issues, to help customers and societies build the resilience needed to adapt and flourish, and to open up a world of opportunity. To achieve this vision, DAO #ESG aims to implement scalable solutions to ensure #ESG data is available and accessible on time and accurately, enabling HSBC's commercial and committed aspirations.

What you'll do:
- Manage the build of insights and data assets aligned to business outcomes and strategic data models
- Manage the gathering of detailed data requirements from stakeholders to deliver analytics use cases
- Manage the delivery of analytics PoCs that allow our businesses and clients to identify new opportunities
- Manage the leveraging of current and emerging data and analytical techniques to address business challenges in an actionable way
- Manage the unlocking of opportunities to derive higher returns through improved data quality
- Manage the uncovering of data inconsistencies across consuming systems
- Adhere to data controls and governance checks on the movement of data
- Look for ways to improve current systems and processes, automating pipelines where possible

Requirements - what you will need to succeed in the role:
- At least 7 years of experience with a Master's degree or equivalent from a reputed university, with specialization in a numerical discipline and/or a concentration in computer science, information systems, or another engineering specialization
- Strong analytical skills with business analysis aptitude; knowledge and understanding of financial services/banking operations is advantageous
- Extensive knowledge of programming tools and hands-on programming experience
- Knowledge of cloud computing (AWS, GCP), data modeling, Adobe Analytics, and/or exposure to data quality assessment and control (IBM MDM, reference data management) is a plus
- Knowledge of BI and visualization tools like QlikSense/Tableau/Power BI
- Ability to comprehend an intricate and diverse range of business problems, analyse them with limited or complex data, and provide a feasible solution framework

What additional skills would be good to have?
- Broad and comprehensive understanding of concepts and principles within #ESG
- Good understanding of the industry and the challenges and changes in the sector.