Hi, greetings of the day! Hope you are doing well. We have an urgent need for a "Data Architect" (Day 1 onsite position). If interested, please send me your updated resume at naveen@epacetech.com or call 281-617-3102.

Role: Data Architect
Client: VISA Inc.
Location: Foster City, CA

Job Description:
• Develop and lead VDC team workshops, planning sessions, and post-mortems
• Source and work with Visa data SMEs to provide business terms for identified fields in the VDC
• Migrate the business term glossary from Excel to the VDC
• Map business terms to physical fields in the VDC
• Coordinate metadata tagging of business terms to the VDC with the VDC team and/or partnered Visa teams
• Evolve Gen AI model development to automatically generate business terms for undefined fields
• Streamline the API mapping process
• Create milestones and drive VDC integration into various third-party systems (e.g., Denodo) and Visa data applications
• Draft operating manual procedures to guide the internal VDC team
• Draft desktop procedures for the Visa data community
• Map sensitive PI to the catalog in partnership with the Data Privacy team
• Manage Jira Align and Jira project management workflows and strategic planning
• Forward planning of requirements and workflows for upcoming quarters
• Provide data community updates
• Provide training to Data Stewards and the data user community
• Support the Data Criticality Model build and integration to support risk ratings of key data assets
• Support data governance efforts through the VDC and its partners
• Ongoing management of VDC content tracking, workflow oversight, and reporting
• Ongoing product management for all items above

The candidate will be responsible for driving Visa's VDC maturity within the Product and Technology business units through:
• Defining and documenting the VDC project charter, scope, objectives, deliverables, milestones, risks, assumptions, and dependencies
• Developing and maintaining the VDC project plan, schedule, and resource allocation
• Coordinating and communicating with the Data Governance Sr. Director, the Data Lifecycle Director, the Data Catalog Team, and other stakeholders on project status, issues, risks, and changes
• Facilitating and leading VDC project meetings, workshops, reviews, and presentations
• Managing the project quality, scope, and change control processes
• Identifying and resolving any project issues, conflicts, or challenges
• Monitoring and reporting on project performance, progress, and outcomes
• Delivering the project deliverables and documentation to the Data Lifecycle Director and other stakeholders
• Conducting project closure and lessons-learned activities on a quarterly basis
• Collaborating across cross-functional groups

The successful candidate must have:
• 7+ years of relevant work experience
• 3+ years of metadata management experience

#dataarchitect #metadata #VDC #datacatalog #c2c #c2crequirements #W2
Naveen Annam’s Post
More Relevant Posts
Data Integration: A Solutions Architect's Journey

As a Solutions Architect, my role often involves grappling with diverse challenges across the realm of IT. However, nothing quite prepared me for the intricacies involved in interfacing data. It was a task I had performed many times before, seemingly routine, yet it soon unfolded into a multifaceted endeavor demanding a comprehensive array of skills and expertise.

At the forefront of this endeavor is the task of seamlessly integrating data, a process that requires a delicate balance of technical proficiency and strategic planning. As the orchestrator of this integration, I found myself not only managing but also directly involved in every step of the process. From coding bespoke programs to utilizing standard BAPIs for data transportation, and meticulously mapping data for the receiving system, each task demanded hands-on attention.

The complexity deepened as the data traversed various functional areas within the modern ERP system. Drawing upon my extensive knowledge of Planning, Procurement, Finance, and other interconnected domains, I navigated the layers of complexity with precision and diligence.

Yet, amidst the technical intricacies, I never lost sight of the ultimate goal: delivering an exceptional end-user experience. While the technical requirements are paramount, I am unwavering in my commitment to ensuring that every aspect of the integration enhances the user experience. It is this dual focus that drives my approach: a relentless pursuit of technical excellence coupled with a keen understanding of the human element.

Throughout the journey, I remained cognizant of the importance of aligning the integration with current and future business processes. It is not merely about executing tasks but about driving tangible value for the organization. Efficiency and profitability are key metrics, and I am dedicated to ensuring that the integration contributes meaningfully to both.

As I navigate the complexities of data integration, I am reminded of the importance of adaptability and resilience. Challenges may arise, but it is through perseverance and ingenuity that solutions are forged. Every obstacle presents an opportunity for growth and innovation.

In conclusion, the journey of a Solutions Architect in interfacing data is one marked by challenges and triumphs. It requires a diverse skill set, unwavering determination, and a relentless focus on delivering results. Yet, amidst the complexity, there lies the satisfaction of knowing that each endeavor brings us one step closer to achieving our goals. And for me, that is what makes this journey truly rewarding.

Whether I've enjoyed the experience is probably akin to asking how I felt when I crossed the finish line of the London Marathon in 2016. I will reserve my judgement for now...
Urgent Demand

Title: Data Architect
Location: Worcester, Massachusetts (Hybrid)
Duration: Permanent / FTE

Summary: We are seeking a highly accomplished Principal Architect with 10 years of experience developing and implementing enterprise-wide architecture strategies, resulting in significant improvements in system reliability and operational costs. Proven track record of leading cross-functional teams and mentoring architects to drive team efficiency and project success rates. Skilled in identifying emerging technologies and trends to improve system performance and customer satisfaction. This is a hybrid role located in Webster, Massachusetts.

Responsibilities:
Backend architecture: Knowledge and ability to design and develop the internal architecture of cloud and on-premises solutions, and the tools and programming languages necessary to ensure the correct functioning of all internal elements.
Integration of systems and technologies: Knowledge of the characteristics and facilities of the systems and the ability to integrate and communicate between cloud and on-premises applications, databases, and technology platforms.
IT standards, procedures, and policies: Knowledge and ability to use a variety of administrative skill sets and technical knowledge to manage the organization's IT policies, standards, and procedures.
Platform technology: Knowledge and ability to request, propose, review, recommend modifications to, recommend adoption of, and maintain cloud and on-premises technology specifications in pursuit of established objectives for modelling technologies.
Cloud computing architecture: Knowledge and ability to design, evaluate, and improve cloud computing architecture to better support requirements for cloud computing services.
Data architecture: Knowledge and ability to design blueprints for how to integrate data resources for business processes and functional support.
Information security architecture: Knowledge of cloud tools and techniques used to create application software, hardware, networks, and infrastructure; ability to meet information security objectives while using them.
Enterprise IT architecture: Knowledge and ability to create and develop the guiding principles and business architecture of a company, including organizational structure, process architecture, and performance management, as well as information, cloud applications, and supporting technology.
DevOps: Knowledge of DevOps practices and tools that reduce the time to deliver a solution, improve adaptation to the market and competition, improve system stability and reliability, and improve mean time to recovery.
Event-driven architecture: Knowledge of the event-driven architecture (EDA) design paradigm and the software components that execute in response to the receipt of one or more event notifications.
𝗖𝗵𝗼𝗼𝘀𝗶𝗻𝗴 𝗬𝗼𝘂𝗿 𝗔𝗿𝗰𝗵𝗶𝘁𝗲𝗰𝘁𝘂𝗿𝗲 𝗦𝗽𝗲𝗰𝗶𝗮𝗹𝘁𝘆: 𝗔𝗻𝗮𝗹𝘆𝘇𝗶𝗻𝗴 𝘁𝗵𝗲 𝗢𝗽𝘁𝗶𝗼𝗻𝘀

Deciding on a specialty takes 𝗲𝘅𝗽𝗲𝗿𝗶𝗲𝗻𝗰𝗲 𝗮𝗻𝗱 𝘂𝗻𝗱𝗲𝗿𝘀𝘁𝗮𝗻𝗱𝗶𝗻𝗴 𝘁𝗵𝗲 𝗹𝗮𝗻𝗱𝘀𝗰𝗮𝗽𝗲. Enterprise, solution, software, and data architecture each have their own requirements, opportunities, and long-term career prospects. Here are factors to consider when specializing:

𝟭 | 𝗦𝗸𝗶𝗹𝗹𝘀 𝗥𝗲𝗾𝘂𝗶𝗿𝗲𝗱 𝗳𝗼𝗿 𝗘𝗮𝗰𝗵 𝗗𝗼𝗺𝗮𝗶𝗻
- 𝙀𝙣𝙩𝙚𝙧𝙥𝙧𝙞𝙨𝙚 𝘼𝙧𝙘𝙝𝙞𝙩𝙚𝙘𝙩: Requires business strategy skills, technical knowledge, and the ability to align IT with business goals. Key skills: strategic thinking, communication, and leadership.
- 𝙎𝙤𝙡𝙪𝙩𝙞𝙤𝙣 𝘼𝙧𝙘𝙝𝙞𝙩𝙚𝙘𝙩: Focuses on designing technical solutions that meet business requirements. Strengths: system design, problem-solving, and technical expertise.
- 𝙎𝙤𝙛𝙩𝙬𝙖𝙧𝙚 𝘼𝙧𝙘𝙝𝙞𝙩𝙚𝙘𝙩: Technical engineering to create scalable and efficient software. Requires deep knowledge of programming languages, software design patterns, and development methodologies.
- 𝘿𝙖𝙩𝙖 𝘼𝙧𝙘𝙝𝙞𝙩𝙚𝙘𝙩: Involves organizing and managing data structures. Requires a solid understanding of database management, data modeling, and big data technologies.

𝟮 | 𝗝𝗼𝗯 𝗠𝗮𝗿𝗸𝗲𝘁 𝗮𝗻𝗱 𝗗𝗲𝗺𝗮𝗻𝗱
- 𝙀𝙣𝙩𝙚𝙧𝙥𝙧𝙞𝙨𝙚 𝘼𝙧𝙘𝙝𝙞𝙩𝙚𝙘𝙩𝙪𝙧𝙚: In demand at larger organizations to align IT with business strategy.
- 𝙎𝙤𝙡𝙪𝙩𝙞𝙤𝙣 𝘼𝙧𝙘𝙝𝙞𝙩𝙚𝙘𝙩𝙨: Sought by IT services, consulting, and technology companies to bridge business needs and technical solutions.
- 𝙎𝙤𝙛𝙩𝙬𝙖𝙧𝙚 𝘼𝙧𝙘𝙝𝙞𝙩𝙚𝙘𝙩𝙨: Needed in tech companies where software scalability and performance are essential.
- 𝘿𝙖𝙩𝙖 𝘼𝙧𝙘𝙝𝙞𝙩𝙚𝙘𝙩𝙨: Valued by organizations that rely on data-driven decision-making and big data technologies.

𝟯 | 𝗟𝗼𝗻𝗴-𝗧𝗲𝗿𝗺 𝗖𝗮𝗿𝗲𝗲𝗿 𝗢𝗽𝗽𝗼𝗿𝘁𝘂𝗻𝗶𝘁𝗶𝗲𝘀
Each specialty offers different career paths:
- 𝙀𝙣𝙩𝙚𝙧𝙥𝙧𝙞𝙨𝙚 𝘼𝙧𝙘𝙝𝙞𝙩𝙚𝙘𝙩𝙨 move into CIO or CTO roles, given their focus on business strategy.
- 𝙎𝙤𝙡𝙪𝙩𝙞𝙤𝙣 𝘼𝙧𝙘𝙝𝙞𝙩𝙚𝙘𝙩𝙨 evolve into senior technical leadership or become consultants specializing in complex technical solutions.
- 𝙎𝙤𝙛𝙩𝙬𝙖𝙧𝙚 𝘼𝙧𝙘𝙝𝙞𝙩𝙚𝙘𝙩𝙨 progress into tech leadership roles, especially in product-driven companies.
- 𝘿𝙖𝙩𝙖 𝘼𝙧𝙘𝙝𝙞𝙩𝙚𝙘𝙩𝙨 advance into data strategy leadership roles or become Chief Data Officers (CDOs) as organizations focus more on data governance and analytics.

________
👍 Hit Like so I know you enjoyed this.
♻️ Repost with your own thoughts for your network.
➕ Follow me, Kevin Donovan, for more. 🔔
________
🚀 Join the Architects' Hub! Unlock more of our 3-𝙖𝙘𝙩𝙞𝙤𝙣𝙖𝙗𝙡𝙚-𝙩𝙞𝙥𝙨 with our coming newsletter. We aim to connect you with a community that gets it. Dive into a network of peers who challenge the status quo. Ready to level up? Improve your skills, meet peers, and elevate your career! Click and Subscribe 👉 https://lnkd.in/dgmQqfu2
--
Photo by Annie Spratt
🌟 **Being workmanlike, and proud of it, as a Data Migration Consultant** 🌟

As a Data Migration Consultant, I find myself entrusted with helping a partner translate the way their business works from the solution they're leaving behind to the solution that I support. There's a lot of cleaning and transforming data along the way. That work isn't the sort of thing that lends itself to a spotlight; it's very grounded, very workmanlike in nature. Each project presents new challenges and opportunities to collaborate with partner companies and with the incredibly talented folks in multiple teams across the company I work for.

🔍 **Connecting the Dots:** At its core, data migration is about reshaping data that used to fit one application model to fit a new model. It's like solving a complex puzzle where every piece matters. There's a lot of satisfaction in doing that reshaping, and in seeing the partner company able to conduct their business more efficiently afterwards.

🚀 **Driving Innovation:** In today's data-driven world, seamless integration is crucial for businesses to thrive. A smooth data migration enables organizations to harness the full potential of their data, using a better software solution to conduct their business. My work empowers companies to be more agile, responsive, and competitive.

🤝 **Collaboration is Key:** Successful data migration requires collaboration across teams and disciplines. Working with talented professionals from various fields enriches my perspective and enhances our collective ability to deliver robust solutions. There is no way I could do what I do without the advice and support of developers, DBAs, business analysts, solution designers, and project managers, among others.

📚 **Lifelong Learning:** Technology evolves rapidly, and staying ahead means continuous learning. From using Python for everything from ad hoc analysis to ETL, to the way Azure Data Factory is beginning to cast a shadow over the tried-and-true SSIS, there's always something new developing in the world of data migration and integration.

🌐 **Impacting Lives:** The work we do has a profound impact. The intersection of health care and finance is significant in all our lives, often at times of great stress. Whether it's improving customer experiences, streamlining operations, or giving a partner company new and novel ways to better serve their consumers, our contributions touch countless lives in meaningful ways. Knowing that my efforts help drive progress and innovation is incredibly fulfilling.

It's mapping, translating, and collaborating until things work the way they need to for the partner to have a good experience going forward with their new solution. 🚀💡

#DataIntegration #DataMigration #DigitalTransformation #Collaboration #Inspiration #CareerJourney
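To make the "reshaping" described above easier to picture, here is a minimal, hypothetical Python sketch of mapping one legacy record onto a new application's model. The field names, status codes, and mapping rules are invented for illustration only; they are not taken from any real system.

```python
# A small, hypothetical sketch of the "reshaping" data migration involves:
# translating a record shaped for a legacy application into the structure a
# new solution expects. All names and codes here are invented for illustration.

LEGACY_STATUS_MAP = {"A": "active", "I": "inactive", "P": "pending"}

def migrate_customer(legacy: dict) -> dict:
    """Map one legacy customer record onto the new application's model."""
    surname, _, given = legacy["CUST_NAME"].partition(",")  # legacy stores "Smith,Jane"
    return {
        "customer_id": int(legacy["CUST_NO"]),  # zero-padded text -> integer key
        "first_name": given.strip() or None,
        "last_name": surname.strip(),
        "status": LEGACY_STATUS_MAP.get(legacy["STATUS_CD"], "unknown"),
        "email": (legacy.get("EMAIL_ADDR") or "").lower() or None,
    }

legacy_row = {"CUST_NO": "00042", "CUST_NAME": "Smith,Jane", "STATUS_CD": "A", "EMAIL_ADDR": "JANE@EXAMPLE.COM"}
print(migrate_customer(legacy_row))
# {'customer_id': 42, 'first_name': 'Jane', 'last_name': 'Smith', 'status': 'active', 'email': 'jane@example.com'}
```

In a real migration the same idea scales up to thousands of fields, lookup tables, and validation rules, but the core move is always this kind of translation from the old shape to the new.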
Data Management Architect
Data Management Architect - CO-WORKER TECHNOLOGY
jobs.co-workertech.com
Data Architect @ Logistics Management Institute, via all AI news ([Global] oracle cloud)
URL: https://ift.tt/xORTeZ8

Overview
Current provider and healthcare directories are often inaccurate, fragmented, burdensome to maintain, rarely support interoperable data exchange or public health reporting, and are overall costly to the health care industry. A potential solution to this deficit is developing a centrally managed, single-source-of-truth National Directory of Healthcare (NDH). LMI is seeking an experienced Data Architect to support a team building this NDH. The directory will serve as a single source of truth for numerous stakeholders, which may include insurance companies, independent researchers, the VA, CMS, CCIIO, OBRHI, and any other users interested in information about providers. This position may be remote.

LMI: Innovation at the Pace of Need™
At LMI, we're reimagining the path from insight to outcome at the new speed of possible. Combining a legacy of over 60 years of federal expertise with our innovation ecosystem, we minimize time to value and accelerate mission success. We energize the brightest minds with emerging technologies to inspire creative solutioning and push the boundaries of capability. LMI advances the pace of progress, enabling our customers to thrive while adapting to evolving mission needs.

Responsibilities
Oversee data architecture for a large-scale API and web application back-end data store.
Implement ETL processes to supply application data for use in the web application.
Develop packages and scripts for system enhancements and interfaces.
Develop scripts to validate data on the various systems.
Write queries to improve database performance and availability.
Work through the ETL and refresh process to identify, troubleshoot, and correct code problems.
Develop, implement, and execute a quality assurance program and quality control standards for all activities.
Execute statements to create and update the tables and views used for testing code on a database.
Establish a database backup/recovery strategy using user-managed backups.
Engineer extensive database solutions and interfaces enabling performance of data requests.
Develop reusable components for deployment in a traditional data warehouse environment.
Standardize database maintenance activities by standardizing directory structure and documentation of DBA activities.
Establish backup and recovery options and implement automation of many DBA utility functions.
Document database design and data definition language, define a data migration strategy between different products/versions, and develop data migration procedures.

Qualifications
Minimum of 5 years' experience designing, developing, and maintaining enterprise database systems.
Experience with cloud database development and implementation (AWS preferred).
Bachelor's degree is required.
Understanding of data migration procedu...
What TECHNICAL TERMS go over your head in project meetings?

Business analysts are often confronted with technical jargon and acronyms that can seem overwhelming. By familiarizing yourself with these acronyms and their implications, you can contribute more meaningfully to project teams. Check out some technical terms you should know and understand as a BA (a short illustrative sketch follows this post):

- API - Application Programming Interfaces define the methods and data formats used for communication between different software systems, allowing data exchange and integration. They enable the sharing of data across systems and platforms by providing endpoints for accessing data and services.
- DBMS - A Database Management System is software that provides a systematic way to store, retrieve, and manage data in databases. It ensures data integrity, security, and efficiency in data-handling operations.
- ERP - Enterprise Resource Planning systems centralize data across an organization, providing a single source of truth for business operations. They manage data related to finance, human resources, supply chain, and other core functions, ensuring data consistency and accessibility.
- ETL - Extract, Transform, and Load processes involve extracting data from source systems, transforming it into a suitable format, and loading it into a destination such as a data warehouse. This is crucial for consolidating data from multiple sources, cleaning it, and preparing it for analysis.
- JSON - JavaScript Object Notation is a lightweight data-interchange format that is easy for both humans and machines to read and write. It is used to transmit data between a server and a web application, often in web services and APIs.
- SaaS - Software as a Service provides software applications over the internet, with data managed and stored in the cloud. It allows for scalable data access and management without the need for on-premises infrastructure.
- SQL - Structured Query Language is used to query, update, and manage data in relational databases. It allows users to perform operations such as data retrieval, insertion, update, and deletion, supporting data manipulation and analysis.
- XML - Extensible Markup Language is used for structuring, storing, and exchanging data between systems. It defines a set of rules for encoding documents in a format that is both human-readable and machine-readable, facilitating data interchange.

P.S. The upcoming Certification in Business Data Analytics (CBDA) Online Prep starts on August 17th! Registration closes on August 12th. Get the details here: https://lnkd.in/gmP4NE7

#businessanalysis #dataanalysis #businessanalyst #dataanalyst #dataanalytics #technicalterms #technicaljargon #iiba
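Several of the terms above tend to show up together in practice. As promised before the list, here is a minimal, illustrative Python sketch (standard library only) that touches JSON, ETL, SQL, and a DBMS. The inline JSON string stands in for the payload a real API endpoint would return, and the field names are invented for the example.

```python
import json
import sqlite3

# --- Extract: in practice this JSON would come from an API call over HTTP ---
api_response = '''
[
  {"id": 1, "name": "Acme Corp",  "revenue": "125000.50", "region": "east"},
  {"id": 2, "name": "Globex Inc", "revenue": "98000.00",  "region": "WEST"}
]
'''
records = json.loads(api_response)  # JSON text -> Python objects

# --- Transform: fix types and normalize values before loading ---
rows = [
    (r["id"], r["name"].strip(), float(r["revenue"]), r["region"].lower())
    for r in records
]

# --- Load: store the cleaned rows in a relational database ---
# SQLite is a small DBMS bundled with Python's standard library.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, revenue REAL, region TEXT)"
)
conn.executemany("INSERT INTO customers VALUES (?, ?, ?, ?)", rows)

# --- SQL: query the loaded data ---
for name, revenue in conn.execute(
    "SELECT name, revenue FROM customers WHERE revenue > 100000 ORDER BY revenue DESC"
):
    print(name, revenue)

conn.close()
```

In a real pipeline the extract step would call an API over the network and the load target would typically be a data warehouse rather than an in-memory SQLite database, but the extract-transform-load shape stays the same.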
Sr. Bench Sales Recruiter | Add me to your C2C roles distribution list (samnandan52@gmail.com)
8mo
samnandan52@gmail.com