We are hiring a Data Management: Solution Architect!

In preparation for an organisational move to SharePoint Online, CARE International UK is looking to carry out a review of its data, content and knowledge management structures in order to understand data and information needs and behaviours.

We are looking for:
🟠 A Data Architect or someone in a similar role
🟠 Proven experience in designing and implementing complex data systems and architectures
🟠 Strong experience with SharePoint and cloud technologies
🟠 An in-depth understanding of data modelling and architecting data solutions across hybrid platforms
🟠 Expertise in creating strategies that enhance data accessibility, insights and analytics

Is this you? Find out more and apply here: https://lnkd.in/enJZyvbF

#Hiring #Data #DataArchitect #WeAreHiring #Jobs #CharityJobs
-
#Hiring #C2C #PowerBIArchitect

Title: Power BI Architect
Location: 100% remote
Visa: USC, GC, EAD
Please share your resume with pavan@symploreus.com

Required skills:
1. Microsoft Power BI – Bachelor's degree in Computer Engineering/Computer Science preferred. Strong understanding of Power Platform concepts, data engineering and analytics concepts, and experience with Power Platform applications and visualization tools.
2. Basic knowledge of Linux and Windows OS administration activities, and a good understanding of O365 applications and fundamentals.
3. Power BI certification preferred. Experience with monitoring and ticket-handling processes.
4. Good communication skills; strong analytical and problem-solving skills.

Responsibilities:
· Advanced error debugging related to reporting, data sources, networking, and security protocols.
· Ability to understand requirements for fundamental access to different databases and data sources.
· Advanced ability to write scripts that call API functions and create automation, including practical ways to deploy such scripts and planning for how users will invoke them.
· Profound understanding of Power BI, Power Apps, Power Automate, and O365 offerings, as well as subsidiary products and features.
· Monitor platform health through periodic manual checks and ensure availability; think fast and help resolve any disturbance when prompted.
· Monitor Premium capacity health and maintain the health of the capacity as needed.
· Windows security and periodic monthly patching activities; basic understanding of security vulnerabilities, CVEs, and remediation of such CVEs.
· Raise the necessary tickets and escalations for any compliance need, such as firewall or database changes.
· Extremely good and positive user communication skills.
· Profound understanding of SQL for query improvements and debugging.
· Address all known issues/vulnerability fixes using SOP documentation.
· Pilot new requests for undocumented procedures.
· Auditing adherence, control generation, and a deep understanding of audit controls.
· Road mapping, team management, rostering, and team formation.
· Piloting new ideas and new technologies.
-
🔵 I've recently completed a comprehensive Business Intelligence (BI) tools course from Skillsoft, gaining valuable insight into the powerful capabilities of various BI tools. The course equipped me with the knowledge and skills to effectively extract, analyze, and visualize data to drive informed decision-making.

The course covered essential BI concepts, including:
ETL with SSIS: ensuring data quality through extraction, transformation, and loading.
Data warehousing: optimizing data storage with data marts.
Data mining (SSAS): using cubes for in-depth #analysis.
Reporting and visualization: creating reports with #PowerBI, #Tableau, and #Excel.
Security: implementing access control, encryption, and retention policies.

#BusinessIntelligence #DataAnalysis #DataTransformation #ETL #SSIS #DataWarehousing #SSAS #DataMining #DataVisualization #BISecurity #DataGovernance #SharePointBI #SQLServer #PerformancePoint #PowerPivot #ReportingServices #DataQuality #GulfJobs #BusinessIntelligenceJobs #BIJobs #DataAnalystJobs #ETLJobs #DataWarehouseJobs #PowerBIJobs #SQLJobs #DataScienceJobs #SSISJobs #SSASJobs #DataEngineerJobs #DataMiningJobs #MiddleEastJobs #UAEJobs #QatarJobs #SaudiJobs #KuwaitJobs #OmanJobs #BahrainJobs #BIEngineer #TechJobsGulf #GulfTechJobs #iec #EuropeJobs #UKJobs #GermanyJobs #FranceJobs #NetherlandsJobs #IrelandJobs #SpainJobs #TechJobsEurope #EUJobs #DataJobsEurope #opentowork #jobs #hiring #TechJobs #GlobalJobs #WorldwideJobs #RemoteTechJobs #AnalyticsJobs #DataJobs #DataVisualizationJobs #AccountingJobs #FinanceJobs #AdminJobs #AdminAssistantJobs #FinancialAnalystJobs #AccountsPayableJobs #AccountsReceivableJobs #RemoteJobs #AdminSupportJobs #DataAndAccountingJobs #OfficeAdminJobs #FinanceAndBIJobs #AccountsJobs #AccountingAssistant #AdminAssistant #FinanceManager #AccountsReceivable #AccountsPayable #FinanceAdministrator #GulfRegionJobs #GulfCareer #AccountingAndAdmin #DeloitteGulf #PwCGulf #KPMG #EYGulf #TataConsultancyServices #AccentureGulf #ErnstAndYoung #BDO #AlFuttaim #EmiratesGroup #DPWorld #AlHabtoorGroup #JumeirahGroup #NissanGulf #GulfAir #Adnoc #QatarPetroleum #SaudiAramco #NakheelProperties #Mubadala #EtihadAirways #GulfBank #DeloitteUAE #PwCUAE #EYDubai
-
📅 DAY 32: Connecting to Various Data Sources in Power BI 🔗

Power BI makes it easy to connect to and visualize data from multiple sources! Here's a quick snapshot of the key connection types to supercharge your analytics:

🗂 File-based: Excel, CSV, PDF, JSON – import directly or connect live for dynamic updates.
🏢 Database: SQL Server, MySQL, PostgreSQL, Oracle – integrate enterprise databases with ease.
☁️ Cloud: Azure Data Lake, Amazon Redshift, Google BigQuery, Snowflake – scale with cloud-based data.
🌐 Online services: SharePoint, Dynamics 365, Salesforce, Google Analytics – pull in data for rich, contextual insights.
🌎 Web data and APIs: web pages, REST APIs – scrape or connect to custom sources for real-time updates.
🔄 Real-time streaming: Azure Stream Analytics, Power BI REST API, Event Hubs – stream live data to your dashboards.

Tips for best results:
🧹 Prepare data before importing for cleaner analysis.
🔒 Secure connections with Power BI gateways.
⚡ Optimize performance for smooth, real-time reporting.

Power BI connects to over 100 data sources – the possibilities are endless! 🌐💼

#entri_elevate #75DaysOfDataAnalysis #DataChallenge #LearningDataScience #DataScienceJourney #opentowork #dataanalystjob #datasciencejob #job #jobhunt #ITjob #remotejob #uaejob #UAE #DataScienceJobs #HiringDataScientists #UAEJobs #RemoteWork #WorkFromHome #TechCareers #AIJobs #WomenInTech #DataScienceOpportunities #GlobalHiring #MachineLearningJobs #75DaysOfDataAnalysisChallenge #EntriElevate #DrJithaPNair #MySQL #SQL #DatabaseManagement #DataScience #DataAnalysis #SQLQueries #DataStorage #DatabaseMagic #PowerBI #PowerBIDashboard #PowerBITips #DataModeling #BusinessIntelligence #Visualization #DigitalTransformation #EnterpriseData #TechInnovation #CloudData #BusinessAnalytics #DataEngineering
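As a hedged illustration (not part of the original post), here is a minimal Power Query (M) sketch of a typical database connection with the "prepare data first" tip applied; the server, database, table, and column names are placeholders:

    let
        // Placeholder Azure SQL source; server/database/table names are illustrative
        Source = Sql.Database("myserver.database.windows.net", "SalesDb"),
        Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
        // Filter early so the import stays light and the step can fold to the source
        Recent = Table.SelectRows(Orders, each [OrderDate] >= #date(2024, 1, 1)),
        // Keep only the columns the report actually needs
        Trimmed = Table.SelectColumns(Recent, {"OrderID", "OrderDate", "Amount"})
    in
        Trimmed

Because both steps fold to the SQL source, only the reduced result set crosses the wire, which also helps gateway and refresh performance.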
-
Hi folks, immediate need:

Power BI Admin
Houston, TX – 100% onsite from day one; need locals or nearby
12+ months contract
Need 10+ years of overall experience; resumes with LinkedIn ID and PP#

Job description:
Understands all aspects of Power BI administration, including moving access control and creating workspaces.

Responsibilities (the bolded items must be evident on the resume):
• Provide full Power BI service administration – user management, workspace management, gateway administration, data source creation, monitoring, and troubleshooting.
• Migrate from Pro licensing to Premium licensing, including converting workspaces, monitoring Premium capacity, configuring/managing workloads, and optimizing Premium capacities.
• Administer and manage the on-premises gateway server, including patching and supporting custom connectors.
• Troubleshoot reporting/connectivity issues related to the gateway.
• Assist in performance tuning of reports using Tabular Editor or DAX Studio.
• Work with data owners and report developers to support their needs in the best way possible.
• Grow and integrate the Power BI service by designing and deploying workspaces, data models, app tiers, gateways, report services, and dashboard support.
• Work in the Power BI admin portal to set up tenant settings, set organizational visuals, and more.

Must have:
• Data source management
• Gateway administration
• Power BI Premium
• Security

Please share your resume with: venkat@sourcemantra.com

#powerbiadminjobs #powerbiadmin #powerbiadministratorjob #recruiters #usitbenchsales #usitbenchmarketing #usitsales #hotlist #c2crequirements #c2cvendors #jobseekers #c2chotlist #c2cusajobs #c2cconsultant #itrequirements #contract #USC #requirement #urgentrequirement #daily #usrecruitment #itrecruiters #itstaffing #longterm #candidates #benchrecruiters #recruitmentcareers #recruitmentagency #vendorlist #updating #benchsales #email #vendors #implementationpartner #usstaffing #c2c #benchsalesrecruiters #requirementsmanagement #requirements #databases #usitstaffing #corptocorp
-
#hiring #c2c #w2 #1099 #contract #mountainview #california #hybrid
Contact: sivak@arkhyatech.com

New role: Tableau Admin – Mountain View, CA (hybrid)

Skills: Tableau administration; Athena, Redshift, AWS

Job description:
• Install, configure, upgrade, and restack Tableau
• Provide user access
• Server performance and troubleshooting
• Perform health checks of the server
• Analyze server logs to optimize scalability and performance
• Manage LDAP and SAML authentication

#tableauadmin #Athena #Redshift #AWS
-
Hi Marnix Pot, I don't know if someone else has already answered this, but here is the approach that #chatgpt suggests 😎:

"Your challenge is a common one when managing multiple independent Power BI semantic models that require parameterization for unique configurations while maintaining centralized control over development. Here are a few potential approaches to address your issue effectively:

1. Use an external metadata source to manage IDs dynamically
Instead of relying on Power BI parameters, you can use a central metadata table stored in a database or SharePoint list to dynamically map each semantic model to its respective business unit's data source. This method avoids parameter overwrites.

Steps:
- Create a metadata table that maps business unit names or IDs to their corresponding data source paths.
- Load this metadata table into Power BI.
- Use a DAX measure or calculated column to filter the metadata dynamically based on environment-specific variables.
- Apply the filtered value to dynamically set the data source or filtering logic within M-Query.

Advantages:
- External metadata management means no need to edit parameters in the Power BI service.
- Metadata can be updated independently of the semantic model.
..."
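A minimal Power Query (M) sketch of the metadata-table idea above, assuming the metadata lives in a central CSV. The URL, column names, and the hardcoded "Unit01" key are all hypothetical; in practice the unit-specific key still has to come from somewhere stable per model, which is exactly the open question in the thread:

    let
        // Hypothetical central metadata CSV: one row per business unit,
        // with columns UnitName and SourcePath
        Raw = Csv.Document(
            Web.Contents("https://contoso.sharepoint.com/sites/bi/metadata/unit_sources.csv"),
            [Delimiter = ",", Encoding = 65001]
        ),
        UnitMap = Table.PromoteHeaders(Raw, [PromoteAllScalars = true]),
        // "Unit01" is hardcoded purely for illustration; deriving this key
        // without a parameter is the unsolved part of the challenge
        ThisUnit = Table.SelectRows(UnitMap, each [UnitName] = "Unit01"),
        FolderPath = ThisUnit{0}[SourcePath],
        // Load only this unit's CSV exports
        Files = Folder.Files(FolderPath)
    in
        Files

The upside of this layout is that changing a unit's source path only requires editing the central CSV, never the published models.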
𝐇𝐞𝐥𝐥𝐨 𝐏𝐨𝐰𝐞𝐫 𝐁𝐈 𝐄𝐧𝐭𝐡𝐮𝐬𝐢𝐚𝐬𝐭, I am looking for a Power BI genius who can help me! I am facing a development challenge, and I am hoping that somebody in my network has a workable solution.

𝐂𝐚𝐬𝐞: I am creating a semantic model for 20 business units. They deliver their data through CSV exports multiple times a day, and they would like to see their data quickly and up to date. Data size can also vary per business unit.

Because of the above, I was looking for a way to give every business unit its own semantic model. The units can then deliver data independently of each other. This brings the big benefit that every semantic model is business-unit dependent (data-wise) and can work independently of the other business units. If one business unit has a lot of data and therefore a long refresh, a business unit with little data and therefore a fast refresh is not affected.

For this to work, I needed a way to centralize my semantic model code (M-Query and Power BI); I am not going to make every change to my semantic model 20 times. I solved this with the 𝐒𝐡𝐚𝐫𝐞𝐏𝐨𝐢𝐧𝐭 𝐒𝐲𝐧𝐜𝐡𝐫𝐨𝐧𝐢𝐳𝐚𝐭𝐢𝐨𝐧 option in the Power BI service, where I synchronize my semantic model in SharePoint with a semantic model in the Power BI service (times 20). Every change I make to my Power BI Desktop file is automatically synchronized to the service (times 20). I never have to publish again and have one centralized way of developing.

𝐌𝐲 𝐜𝐡𝐚𝐥𝐥𝐞𝐧𝐠𝐞: To define the right data source path per semantic model, I use parameters. I change the parameters in the Power BI service to the right business unit, and this defines the data per semantic model. My issue is that my parameters are overwritten whenever I make a change to my SharePoint desktop file (publishing). This makes sense, because the sync looks at the default code and puts the parameter back to its original state. I wish it would remember the parameter value, but it does not.

What I am looking for is a way for M-Query to generate or calculate a semantic-model-specific value (like an ID or name), which I can then use as a filter instead of a parameter. I have already tried:
- Number.Random
- List.Random
- List.Random + seed
- DAX DMVs
- Dynamic parameter (with DirectQuery)
- #shared
- Power BI REST API
- Power Automate

I really hope there is a genius in my network who can help me with the right out-of-the-box solution. Thank you so much in advance, Marnix

#Mquery #DAX #PowerBI #SharePoint #Challenge
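For context, this is roughly what the parameter wiring described above looks like in M; it is a sketch with illustrative names, not Marnix's actual code. The value set for the parameter in the service is what gets reset to the default on every synchronized publish:

    // Query 1 (a separate parameter query in the model): defined as Power BI
    // Desktop generates it; the service-side value is wiped back to "Unit01"
    // on each publish/sync
    BusinessUnit = "Unit01" meta [IsParameterQuery = true, Type = "Text", IsParameterQueryRequired = true]

    // Query 2: data query that resolves its source folder from the parameter
    let
        Source = Folder.Files("\\fileshare\csv-exports\" & BusinessUnit)
    in
        Source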
-
𝗗𝗮𝘆 𝟰: 𝗖𝗼𝗻𝗻𝗲𝗰𝘁𝗼𝗿𝘀

Hello, everyone! Today, we're diving into the world of connectors in Power Apps, which help your app interact with external data and services. Let's see how connectors work and how you can use them to supercharge your apps! ⚡

🌉 𝗪𝗵𝗮𝘁 𝗔𝗿𝗲 𝗖𝗼𝗻𝗻𝗲𝗰𝘁𝗼𝗿𝘀?
Connectors act as bridges, enabling your app to communicate with various data sources and services. Think of them as digital gateways that allow you to pull in data from different systems, process it, and display it in your app. Whether you need data from SharePoint, SQL Server, Excel, or even social media platforms, connectors make it happen seamlessly. 🔄

🛠️ 𝗧𝘆𝗽𝗲𝘀 𝗼𝗳 𝗖𝗼𝗻𝗻𝗲𝗰𝘁𝗼𝗿𝘀
🥇 Standard connectors: These are pre-built and cover a wide range of commonly used services like SharePoint, Office 365, and Excel. They're easy to set up and use, making them perfect for getting started quickly. For example, the Office 365 Users connector can fetch user profiles, including details like email, department, and even profile pictures. Imagine building an employee directory with this!
🥈 Premium connectors: These offer more specialized integrations and may require additional licensing. Services like SQL Server, Salesforce, and Azure are available as premium connectors, providing advanced functionality and integrations. For instance, you can use the SQL Server connector to interact with your enterprise databases, performing CRUD operations directly from your app. 📊
🥉 Custom connectors: When you need something specific that's not available out of the box, you can create custom connectors. These are built on top of APIs and can be tailored to meet your unique business needs. For example, if you have a proprietary CRM system, a custom connector can bring in customer data, enabling your app to provide personalized services.

🔗 𝗛𝗼𝘄 𝘁𝗼 𝗔𝗱𝗱 𝗮 𝗖𝗼𝗻𝗻𝗲𝗰𝘁𝗼𝗿
👉 Navigate to the data section: In Power Apps Studio, click on "Data" and then "Connections."
👉 Add a new connection: Click "+ New Connection," choose your desired service, and follow the prompts to authenticate. For instance, when connecting to SharePoint, you'll need to log in with your credentials.
👉 Bind to controls: Once connected, you can use this data in controls like galleries, forms, and charts. It's straightforward and incredibly powerful!

✨ Real-world use case: Consider a retail company that needs to manage product inventories. Using the SQL Server connector, the company can build an app that accesses real-time inventory data. Store managers can update stock levels, reorder products, and view sales reports, all within the app. This integration ensures that data is consistent and up to date across all channels. 📦

🎤 Let's chat! What data sources do you frequently use in your organization? Are there any specific connectors you're excited to try out in Power Apps? Share your thoughts! 💬👇

#PowerApps #PowerPlatform #Connectors #SharePoint #PowerBI #SQL
-
🔰 Power BI Service load ramp-up

➡️ The data team head informed us of a degraded experience with Power BI reports over the past couple of weeks. Fourteen reports in one group make up the top 50% of consumers of Power BI Service capacity.

📌 The issue is that usage has grown over the past two weeks, per the statistics shared. The troughs fall on the weekends, and background usage during those troughs has been increasing over the past few months. Because many reports perform hourly refreshes all weekend long, the higher the trough on the weekend, the more likely the capacity is to overload on Monday/Tuesday during the ramp-up.

📍 Due to how Microsoft applies its throttling policy (a detailed blog from Microsoft: https://lnkd.in/gfUhkQpe), it can be hours or days before the service recovers.

▶️ This issue impacts everyone across the entire capacity, since the group's overall share was the top 50% of the load. We were asked to check the following optimization techniques ⤵️
1️⃣ Switch off automated refresh and instead use a Power Automate flow to trigger a data refresh when the data is uploaded to SharePoint.
2️⃣ Reduce or eliminate Grouping, Merging, Pivoting, Unpivoting, and List.Max / Table.Max Power Query steps in your dataflows.
3️⃣ Turn off Q&A on the dataset.
4️⃣ Avoid leaving default table views unfiltered, especially for large tables.
5️⃣ Optimize long-duration dataflows: https://lnkd.in/gN3y-NJ9
6️⃣ Optimize Power BI operations: https://lnkd.in/gmxBRdhc

⏩️ We need to keep the weekend trough below 30% in order to sustain the capacity for a significant period.

🛢 The report I had worked on had the highest impact (compared to the other reports), with the following details ⤵️
1️⃣ Refresh frequency: 12 times/day (hourly)
2️⃣ Total refresh duration (s): 27,646
3️⃣ Total CU (s): 700,765.04
4️⃣ CU % of total: 18.37%
5️⃣ Total impact % of total: 15.91%

↔️ I connected with the report stakeholder and explained the report's impact on the Power BI service load; per the usage-metrics view, only 2 end users view the report in a week. Based on these factors, we reduced the refresh to 3 times per day, bringing the total impact down to 1.5%.

#PowerBIService #Load #OptimizationTechniques
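To make point 2️⃣ concrete, here is a hedged Power Query (M) sketch; the server, database, and query below are placeholders, not taken from the post. It replaces an in-dataflow Table.Group with aggregation pushed down to the source, so the capacity's mashup engine does far less work per refresh:

    let
        Source = Sql.Database("sql-prod-01", "SalesDb"),
        // Instead of importing dbo.Sales and running Table.Group in the
        // dataflow (CPU-heavy on the capacity), let the database aggregate
        Aggregated = Value.NativeQuery(
            Source,
            "SELECT Region, SUM(Amount) AS TotalAmount FROM dbo.Sales GROUP BY Region",
            null,
            [EnableFolding = true]
        )
    in
        Aggregated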
-
Deploying Business Intelligence (BI) remotely involves setting up the necessary infrastructure, tools, and processes to enable BI activities without a physical presence in an office. Here's a step-by-step guide to deploying BI remotely:

1. Assess remote needs: Understand the specific requirements and constraints of remote BI deployment. Consider factors like data accessibility, network reliability, security, and user collaboration.
2. Select BI tools: Choose BI tools that support remote access and collaboration effectively. Cloud-based BI platforms like Microsoft Power BI, Tableau Online, or Google Data Studio are popular choices, as they offer remote access, scalability, and collaboration features.
3. Data accessibility: Ensure that remote teams can access the necessary data securely. Use cloud-based storage solutions like AWS S3, Google Cloud Storage, or Azure Blob Storage for centralized data storage and accessibility.
4. Data integration: Implement robust data integration processes to collect, cleanse, and integrate data from various sources. Use ETL (Extract, Transform, Load) tools like Apache NiFi, Talend, or Informatica for seamless data integration.
5. Security measures: Implement stringent security measures to protect sensitive data. Use encryption, access controls, VPNs (Virtual Private Networks), and multi-factor authentication to secure data access and communication channels.
6. Remote access infrastructure: Set up remote access infrastructure so BI tools and applications can be accessed securely from anywhere. Use VPNs, Remote Desktop Services (RDS), or Virtual Desktop Infrastructure (VDI) solutions for secure remote access.
7. Collaboration tools: Implement collaboration tools to facilitate communication and collaboration among remote teams. Use platforms like Microsoft Teams, Slack, or Zoom for virtual meetings, discussions, and knowledge sharing.
8. Training and support: Provide comprehensive training and support to remote users on BI tools, processes, and best practices. Conduct virtual training sessions, create online documentation, and establish support channels for addressing user queries and issues.
9. Performance monitoring: Monitor the performance of BI systems and infrastructure remotely. Use monitoring tools and dashboards to track system health, data availability, and performance metrics. Implement alerts and notifications to proactively address any issues that may arise.
10. Continuous improvement: Continuously evaluate and improve remote BI deployment processes. Solicit feedback from remote users, identify areas for optimization, and implement enhancements to improve efficiency, reliability, and user satisfaction.

By following these steps, organizations can effectively deploy BI remotely, enabling remote teams to access, analyze, and derive insights from data regardless of their location.
-
Rajan Malhotra Adarsh Pradhan Ankita Vrdhan Tanish Shukla

Hello folks, hope you are doing well. I would like to present a new job opportunity that you may find interesting. If it interests you and matches your profile, kindly send the required documents to sarabhjotkour@smsoftconsulting.com by Friday, September 13, 11:30 AM EST.

Job title: RQ07885 – Software Developer – Senior
Client: Ministry of the Solicitor General
Work location: 595 Bay Street, Toronto (hybrid)
Estimated start date: 2024-10-21
Estimated end date: 2025-03-31
Business days: 186
Extension: probable after the initial mandate
Hours: 7.25 hours per day
Security level: Enhanced

Must have:
• Power BI: proficiency in Power BI, including data modeling, creating interactive reports and dashboards, and using advanced features.
• ETL/Microsoft Azure Synapse Analytics experience, including its architecture, data storage options, data ingestion mechanisms, and security features.
• Dynamics 365 CE and/or F&O report development experience.

Nice to have:
• Azure cloud certifications (e.g., Azure Fundamentals, Power BI Data Analyst Associate)
• Dynamics 365 certifications

Description

Scope:
• The OPGT is developing "PGTIMS", a new Dynamics CE-based solution that will modernize the OPGT's legacy applications and systems. This project requires migrating historical data from the legacy solutions to a new data warehouse/lakehouse, building all the new data assets in the lakehouse, and creating new Power BI reports.

Deliverables:
• As a member of the reporting team, you will be responsible for delivering the MVP for reporting, building the new data warehouse/lakehouse, and delivering the required reporting. A high-level list of deliverables follows:
• Data warehouse design: analyze business requirements and design an efficient data warehouse/lakehouse/golden layer on Microsoft Azure.
• Data integration and ETL: help develop and automate ETL processes to load data from Dynamics 365 CE and F&O into the new data warehouse/lakehouse (bronze/silver/golden layers). Help integrate data from diverse sources into the new data warehouse.
• Data modeling: create logical and physical data models that align with the project's reporting and analytical needs.
• Data security and governance: implement appropriate security controls to protect sensitive data within the data warehouse and reports.
• Reporting and visualization: develop reports, dashboards, and visualizations per the reporting MVP. Ensure the quality, accuracy, and integrity of reports.
• Performance optimization: fine-tune the data warehouse and reporting processes to optimize query performance and overall system efficiency.
• Other duties as assigned.

Skills and knowledge:
• Data warehouse/lakehouse concepts: strong understanding of data warehousing principles, including data modeling, ETL processes, data integration, and data governance.