#DataArchitect #DataWarehouseDesign #DataEngineering #BusinessIntelligence #DatabaseDesign #ETLDevelopment #DataValidation #PerformanceTuning #MSBI (Microsoft Business Intelligence) #FinanceData #SQLSSIS #PowerBIReports #AnalyticsSolutions #DataDrivenDecisions #TechCollaboration #QualityAssurance #DataModeling #DataIntegration #InsurTech (Insurance Technology) #TestCases
Deepesh Pratap Singh’s Post
-
How to re-engineer a business process:
1. Identify the gap while testing the target-state SF spot capital ERCF automated reporting (for which I also designed and developed the SQL) in a lower environment. Drill down into the report data to isolate missing SF CRT deals as the source of the discrepancy (see the SQL sketch after this list).
2. Confirm these are new-issue CRT deals, and confirm their capital relief effective month, by reviewing the private placement memorandum or term sheet. Trace the data flow upstream from the database source feeding the SQL to the SF CRT model engine, and confirm the engine is not vending spot capital data to the data warehouse for these new CRT deals.
3. Set up a meeting with the finance manager and the SF CRT model system senior engineer to review findings. Confirm the model engine is not processing new CRT deals because of latency in its inputs (including a third-party source), and confirm there are no production gaps, since the current process accepts modeled outputs for new CRT deals from the upstream system.
4. Draft a multi-month, roll-over, target-state integrated business/data/engine process flow identifying inputs/outputs and sources/destinations and addressing the input latency and gaps. Review it internally with the senior finance director and the CRT model engine director.
5. Set up and lead/facilitate meetings with upstream stakeholders to review the proposed flow with senior directors/POCs: the CRT business and modeling teams, the CRT data team, and the data team upstream from the model engine. Align stakeholders on deliverables, SLAs, and communication channels.
6. Design and execute system integration testing with all stakeholders, systems, data sources, and automated and manual processes. Validate the automated SF spot capital ERCF report, then share the report and validation results with the finance team for review and approval.
7. Assist stakeholders with updating procedures and internal controls as needed.
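A minimal sketch of the step-1 drill-down, an anti-join between the deal population and the spot capital feed; all table and column names here (crt.deal_master, dw.sf_spot_capital, deal_id, reporting_month) are hypothetical stand-ins, not the actual report SQL:

```sql
-- Deals present in the CRT deal master but absent from the spot capital feed
-- for the reporting month are the candidates for the discrepancy.
SELECT d.deal_id,
       d.issue_date
FROM   crt.deal_master AS d
LEFT JOIN dw.sf_spot_capital AS c
       ON  c.deal_id         = d.deal_id
       AND c.reporting_month = '2024-06-01'
WHERE  c.deal_id IS NULL          -- no spot capital vended for this deal
ORDER BY d.issue_date DESC;       -- new-issue deals surface first
```

Any new-issue deals returned here, cross-checked against the term sheets in step 2, are the deals the model engine has not yet vended spot capital for.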
-
Update Operation
The update operation updates a single instance of an object, with checks for concurrent updates. The part of the business object susceptible to the update can be limited by the defined business object view. The update operation may add and/or remove parts of the business object, but it is not meant to be used for status changes. A confirmation message is always sent.
To check for concurrent updates, the operation inspects the ChangeStateId element to determine whether somebody else has changed the same business object instance since the last read. If this is the case, an error message is returned.

Query Operation
When defining a query operation in the Web Service Creation Wizard, you can only use queries that already exist for the business object. The query operation retrieves specified information from business object instances. Query operations only read data; there is no change to any persistent data. A response message is always sent.
The basic query naming pattern is Find[View]SimpleBy[Selection criteria], where:
- The business object view defines the response structure.
- The selection criteria view splits the operation so that each operation offers a semantically meaningful set of selection criteria typically needed for a given purpose. It is determined from the name of the underlying business object query.

Maximum Number of Rows Returned
You can define the maximum number of rows returned by a query using the following elements of the ProcessingConditions element:
- QueryHitsUnlimitedIndicator
- QueryHitsMaximumNumberValue
If QueryHitsUnlimitedIndicator is not set and QueryHitsMaximumNumberValue is zero, the default of 100 rows is assumed.

Comparisons Permitted
You can use the following types of comparisons in Web service query operations:
- Equals
- Between
- Less than
- Less than or equal to
- Greater than
- Greater than or equal to

Action Operation
When defining an action operation in the Web Service Creation Wizard, you can only use actions that already exist for the business object. The action operation changes the state of a business object instance; it is not intended to be used to modify data. The request message types are typically very short; they often contain just the business object ID. A response message is always sent.
-
Why did the data catalog break up with its partner? Because it couldn't handle so many relationships...
On a serious note, though: how many entities can your catalog responsively support? How many do you need it to support, and how does your catalog scale? Alex Solutions supports tens of millions; frankly, if the infrastructure is big enough to support the knowledge graph in all its glory, the limits are probably relatively boundless. But it remains an important question.
Consider that a table is just one entity. If it has ten attributes, that jumps to 11 entities, and those 10 attributes each have a relationship with the table, so your catalog database now contains at least 21 records (a rough sizing query is sketched below). Start adding views, stored procedures, ETL and reporting applications, and it very quickly explodes to hundreds, then thousands, then millions of entries. Start adding data people, controls, technology describers, business processes, quality measures, KPIs and metrics, and it extends further. Can your catalog adequately serve up the answers you need in this context?
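A rough way to reproduce that arithmetic against a live database is to count base tables, columns, and the implied table-to-column relationships from the standard INFORMATION_SCHEMA views. This is only a lower bound: it ignores views, stored procedures, ETL jobs, reports, people, and every other entity type a catalog would also track, and on some engines the column count includes view columns.

```sql
-- Every base table is one entity, every column another entity,
-- and every column also carries one relationship back to its table.
SELECT tbl.cnt               AS table_entities,
       col.cnt               AS column_entities,
       col.cnt               AS table_column_relationships,
       tbl.cnt + 2 * col.cnt AS minimum_catalog_records
FROM (SELECT COUNT(*) AS cnt
      FROM INFORMATION_SCHEMA.TABLES
      WHERE TABLE_TYPE = 'BASE TABLE') AS tbl
CROSS JOIN
     (SELECT COUNT(*) AS cnt
      FROM INFORMATION_SCHEMA.COLUMNS) AS col;
```

For the post's example of one table with ten attributes, the formula gives 1 + 2 × 10 = 21 records before anything else is catalogued.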
-
𝐃𝐚𝐭𝐚 𝐛𝐚𝐜𝐤𝐟𝐢𝐥𝐥𝐢𝐧𝐠 is a common task for data engineers, especially when working with large and constantly changing datasets. It refers to the process of loading historical data into a system or pipeline that was either missed, incorrectly processed, or not yet available.
𝐖𝐡𝐲 𝐝𝐨 𝐲𝐨𝐮 𝐧𝐞𝐞𝐝 𝐝𝐚𝐭𝐚 𝐛𝐚𝐜𝐤𝐟𝐢𝐥𝐥𝐢𝐧𝐠? Some common reasons include new business requirements, correcting data quality issues, migrating legacy systems, or recovering from downtime in a data pipeline that needs to be caught up.
𝐁𝐞𝐬𝐭 𝐏𝐫𝐚𝐜𝐭𝐢𝐜𝐞𝐬 𝐟𝐨𝐫 𝐁𝐚𝐜𝐤𝐟𝐢𝐥𝐥𝐢𝐧𝐠 𝐃𝐚𝐭𝐚
- Break down large backfills into smaller, manageable batches (parallelization) to minimize system impact and allow for better error handling
- Check the quality and integrity of the source data before beginning the backfill process
- Ensure that your backfill jobs are idempotent, meaning they can be run multiple times without causing duplicate or incorrect data. This is critical when dealing with failures or retries (see the sketch after this list)
- Be mindful of schema versions when backfilling, especially if your data processing logic has evolved. Ensure transformations applied to historical data match the expected schema and business logic
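One common way to satisfy both the batching and the idempotency points is to process a single partition (for example one day) per run and upsert into the target, so a retry of the same partition overwrites rather than duplicates. A minimal T-SQL sketch, with hypothetical staging.sales_history and dbo.fact_sales tables:

```sql
-- One date partition per execution keeps each batch small and bounded.
DECLARE @BackfillDate date = '2023-07-01';

MERGE dbo.fact_sales AS tgt
USING (
    SELECT sale_id, sale_date, amount
    FROM   staging.sales_history
    WHERE  sale_date = @BackfillDate
) AS src
   ON tgt.sale_id = src.sale_id            -- natural key makes reruns safe
WHEN MATCHED THEN
    UPDATE SET tgt.sale_date = src.sale_date,
               tgt.amount    = src.amount
WHEN NOT MATCHED THEN
    INSERT (sale_id, sale_date, amount)
    VALUES (src.sale_id, src.sale_date, src.amount);
```

Each date can then be dispatched as its own job, which also gives the parallelization and per-batch error handling described above.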
-
Data file exchange is a game-changer for any business that needs to import data files into its systems, but what is it? 📂🚀 Find out what data exchange is and how it can help you safely and efficiently collect, validate, transform and import data files from customers, suppliers, partners, agents or other unmanaged external sources: https://buff.ly/3S68tSx #developers #csv #datamanagement #dev #dataonboarding
Data file exchange: What it is and why it's essential for businesses today
flatfile.com
-
If you know the words #dataimport, #dataonboarding, #dataexchange, then you need to read this. And if you know the pain of data imports, data validation, and manual data transformation, you MUST read it.
Data file exchange: What it is and why it's essential for businesses today
flatfile.com
-
Have You Experienced Shape-Shifting Data?
A business person is looking at a report and thinks one of the numbers is not correct. They go to the architect or responsible IT person who manages the pipelines and say: "I don't think the number 3 here is correct." The architect comes back saying, "I checked the source system, and in table ABC column 123 has the number 3 in it, so it's correct."
Can a number be both correct and incorrect at the same time? Yes, it can.
It might be that the 3 is in the database but it's a value that should not be there (human error). The number 3 would then be technically correct, but business-contextually incorrect.
It might be that the 3 is in the database but it's actually time-sensitive data, so the number 3 is technically correct but not correct for the person viewing it in this timezone.
It might be that the 3 is in the database and it has been correct, but the source system logic changed in the last release, so the number 3 is technically correct but business-contextually incorrect because it is outdated.
The only way to talk about data correctly is to share the same understanding of what it means for the number to be correct. You need an agreement about what "correct" means and the attributes of "correct" recorded somewhere in a central place (with proper ownership).
#nakyvamuutos #bringeyourdatagap
Bonus fun for Friday: can anyone guess where this picture is taken from? :)
-
As a data professional, you often need to understand the state of data as of a specific date, for example to identify changes made since a particular point in time. While creating a change log or implementing Change Data Capture (CDC) is a typical approach, it's often impractical to maintain such logs for all tables, especially in systems with a significant number of them. Modern data warehouses and data systems often contain 50+ tables, so maintaining change logs for many of them becomes a substantial overhead. Data evolves constantly, and sometimes the changes are extensive, particularly in data warehouse environments. When debugging data issues, being able to view the state of the data as of a specific date can be invaluable (one SQL Server option is sketched below). Here is a short post: https://lnkd.in/eN9af2VC
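One built-in option on SQL Server is a system-versioned temporal table, which gives as-of queries without a hand-rolled change log. The sketch below assumes a hypothetical dbo.customer table created WITH (SYSTEM_VERSIONING = ON) and period columns named valid_from and valid_to:

```sql
-- State of the row exactly as it was at the given instant (UTC).
SELECT customer_id,
       credit_limit,
       valid_from,
       valid_to
FROM   dbo.customer
       FOR SYSTEM_TIME AS OF '2024-01-31T00:00:00'
WHERE  customer_id = 42;

-- Every version of the row that was valid during January, for debugging drift.
SELECT *
FROM   dbo.customer
       FOR SYSTEM_TIME BETWEEN '2024-01-01' AND '2024-02-01'
WHERE  customer_id = 42
ORDER BY valid_from;
```

The history table behind it does grow, so this trades storage for the convenience of point-in-time queries on the tables where they matter most.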
-
👀 Struggling with manual financial data processing? 🎯 Enhance your financial decision-making with automated data processing for improved efficiency. Imagine no longer facing last-minute manual data preparation challenges from your team. Experience seamless handling of millions of records from various systems, yielding consistent and Measurable Results® every time.
💡 Find out how automated data processing can revolutionize your decision-making process and save valuable time for your financial team! ✅ Click here for more: https://hubs.li/Q02F8rHp0
Authored by: Jacob McClendon
Matillion Snowflake Tableau
#SVA #SVADataAnalyticsandVisualization #Matillion #FinancialManagement #Automation #DataProcessing
Automated Data Processing Helps Improve Decision-Making
consulting.sva.com