Are you a Data Engineer or Data Scientist?

In today’s data-driven economy, businesses depend on skilled professionals to turn raw information into actionable insights. Two of the most critical roles are the data engineer and the data scientist. While these titles are often mentioned together, their responsibilities, skills, and day-to-day work differ significantly.

If you’re considering a career path — or trying to hire the right talent — understanding the difference between a data engineer vs. data scientist is essential.

What Does a Data Engineer Do?

Data engineers are the architects and builders of an organization’s data infrastructure. They design, construct, and maintain the pipelines, databases, and platforms that make clean, reliable data available for analytics and business intelligence.

Core Responsibilities of Data Engineers

  • Designing and maintaining data pipelines (ETL/ELT processes)
  • Building and managing data warehouses and data lakes
  • Ensuring data quality, consistency, and scalability
  • Implementing security measures to protect sensitive information
  • Optimizing data systems for performance, cost, and efficiency
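
To make the pipeline idea concrete, here is a minimal sketch of an ETL flow in plain Python — extract, transform (with simple data-quality rules), load. The field names are invented and a list stands in for the warehouse; production pipelines would lean on tools like Spark or Airflow instead.

```python
import csv
import io

# Hypothetical raw export from a source system (fields are illustrative).
RAW = """user_id,signup_date,country
1,2024-01-05,US
2,2024-01-06,
3,not-a-date,DE
"""

def extract(text):
    """Extract: parse the raw CSV into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows that fail data-quality rules, normalize values."""
    clean = []
    for row in rows:
        if not row["country"]:
            continue  # rule: country is required
        year, _, _ = row["signup_date"].partition("-")
        if not year.isdigit():
            continue  # rule: date must start with a numeric year
        clean.append({"user_id": int(row["user_id"]),
                      "signup_date": row["signup_date"],
                      "country": row["country"].upper()})
    return clean

def load(rows, warehouse):
    """Load: append validated rows to the 'warehouse' (a list stand-in)."""
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract(RAW)), warehouse)
print(loaded)  # rows that survived validation
```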

Key Skills for Data Engineers

  • Strong programming in Python, SQL, and Scala
  • Expertise in databases (SQL & NoSQL)
  • Familiarity with cloud platforms (AWS, Azure, GCP)
  • Big data tools: Qlik, Apache Spark, Hadoop, Kafka
  • Workflow orchestration tools like Airflow

👉 Explore more on how we help clients build scalable infrastructures in our Data Engineering Services page.

What Does a Data Scientist Do?

While engineers prepare the data, data scientists dive into it to uncover insights, predict outcomes, and inform decision-making. They apply statistics, machine learning, and AI to transform raw datasets into actionable intelligence.

Core Responsibilities of Data Scientists

  • Cleaning, exploring, and preparing data for modeling
  • Applying statistical analysis and machine learning algorithms
  • Building predictive and classification models
  • Visualizing complex results for technical and business audiences
  • Framing business problems as solvable data questions
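
As a toy illustration of the modeling side, the sketch below fits a straight line by ordinary least squares using only the standard library and invented numbers; real projects would reach for scikit-learn or statsmodels, but the workflow — fit on historical data, predict forward — is the same.

```python
# Fit y = slope*x + intercept by ordinary least squares (closed form).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]   # e.g. months since launch (made up)
ys = [2.1, 3.9, 6.2, 7.8, 10.1]  # e.g. revenue (made up)

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

def predict(x):
    """Forecast y for a new x using the fitted line."""
    return slope * x + intercept

print(round(predict(6.0), 1))  # forecast for month 6
```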

Key Skills for Data Scientists

  • Strong background in math, statistics, and machine learning
  • Programming with Python, R, or Julia
  • Proficiency in visualization tools: Tableau, Power BI
  • Experience with ML libraries (scikit-learn, TensorFlow, PyTorch)
  • Ability to communicate business-ready recommendations

Learn more about how predictive modeling drives business impact with our Advanced Analytics Solutions.

Data Engineer vs. Data Scientist: Key Differences

Here’s a side-by-side comparison of the data engineer vs. data scientist roles:

| Feature | Data Engineer | Data Scientist |
| --- | --- | --- |
| Focus | Build and maintain data infrastructure | Analyze data, build predictive models |
| Skills | Programming (Python, SQL), ETL, cloud platforms, big data tech | Statistical analysis, ML, data viz, business acumen |
| Tools | Spark, Hadoop, SQL, Airflow, Kafka, ClickHouse Cloud | Python, R, Tableau, scikit-learn, TensorFlow, AutoML |
| Goal | Deliver robust, reliable, and secure data | Extract insights and drive business strategy |

While different, these roles are deeply interconnected. Data engineers ensure high-quality foundations; data scientists transform that foundation into insights.

Which Career Path Is Right for You?

If you’re debating between becoming a data engineer or a data scientist, consider:

  • Do you enjoy building systems, solving infrastructure problems, and optimizing performance? → Data Engineering may be your fit.
  • Do you prefer analyzing data, applying models, and storytelling with insights? → Data Science might be your direction.

Both paths are in high demand and offer strong growth opportunities. For organizations, the best results come when both roles collaborate closely.

(Curious how we guide talent strategy? Read our Data Careers Guide for insights.)

What Comes First: Data Engineering or Data Science?

The reality is — data engineering usually comes first. Without well-structured, accessible data, even the most advanced science and modeling will fail.

Think of it like constructing a building: you wouldn’t hire interior designers before architects and builders lay a solid foundation. Similarly, no data science project succeeds without a trusted, scalable infrastructure in place.

👉 This is why many companies start by investing in Modern Data Infrastructure before scaling analytics initiatives.

Future of Data Roles

Both roles are evolving with emerging technologies:

  • Data Engineers: Focus on cloud-native architectures, data governance, and security
  • Data Scientists: Sharpen expertise in deep learning, natural language processing (NLP), and explainable AI

Automation tools are accelerating workflows, but the demand for human expertise in designing systems and interpreting results will only grow.

Organizations that foster close collaboration between these two functions will be best positioned to leverage AI and data for competitive advantage.

For more perspectives on where business data is heading, check out our recent post on The Future of AI in Business.

Forward Thinking

The distinction between data engineers and data scientists isn’t about competition — it’s about collaboration. Together, they form the backbone of any modern data team.

Businesses that want to succeed in the data economy must invest in both infrastructure (engineering) and analytics (science). For individuals, both career paths offer rewarding opportunities to shape the future of how organizations harness information.


Frequently Asked Questions (FAQ)

Is data engineering harder than data science?

Not necessarily. Data engineering leans heavily on programming, system design, and cloud infrastructure, while data science requires a deep understanding of math, statistics, and modeling. The difficulty depends on your background and interests.

Who earns more: Data engineer or data scientist?

Salaries vary by industry and experience, but historically data scientists earn slightly higher median salaries due to their specialization in machine learning and AI. However, demand for data engineers is rising quickly as companies recognize the importance of solid infrastructure.

Do you need data engineering before data science?

Yes. Without a reliable and scalable data infrastructure, data scientists cannot work effectively. That’s why many organizations invest in data engineering first, then scale into analytics and advanced modeling.

Which career path should I choose?

If you enjoy building systems, working with databases, and solving infrastructure problems, pursue data engineering. If you’re more interested in analytics, machine learning, and storytelling with data, consider data science. Both fields are in high demand.

Are data engineers and data scientists replacing each other?

No — these are complementary roles. Data engineers build the foundation, while data scientists analyze and interpret the data. Together, they drive data-driven decision-making.

Top 8 Data Integration Challenges in Education and How to Solve Them

Schools and universities run on many systems—SIS, LMS, assessments, finance, alumni, and clinical programs. Without data integration, insight stays trapped, reports conflict, and decisions slow down. With the right data integration plan, these systems tell one story about students, programs, and resources.

Education analytics solutions · Data integration services · Contact us

Why Data Integration Matters

Education leaders need timely, trusted information to act. When data integration connects core systems, teams move faster and serve students better.

• One source of truth reduces rework and debate

• Real-time context guides interventions and resource allocation

• Consistent definitions make cross-campus reporting possible

1. Siloed Systems

Grades in the SIS, coursework in the LMS, tests in separate apps, and alumni data elsewhere create a fractured view.

• Standard APIs and formats align the language of your data

• A warehouse or lake becomes the trusted hub for analytics

• Role-aware access keeps the right people informed

See our data engineering approach

2. Inconsistent Data, Conflicting Reports

Different definitions for attendance, course completion, or program status lead to “dueling dashboards.” Establishing common definitions, validation rules, and routine data quality checks aligns reports across campuses and terms. Governance gives everyone confidence in what the data means.

• Shared definitions and validation rules end report drift

• Routine quality checks catch errors before they spread

• Data lineage explains where numbers come from
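
Here’s a minimal sketch of what shared validation rules can look like in practice — every campus runs the same checks before data lands in reports. The field names and rules are illustrative, not a prescribed schema.

```python
# Shared definitions encoded as validation rules.
RULES = {
    # agreed definition: attendance is a fraction of sessions, not a percent
    "attendance_rate": lambda v: 0.0 <= v <= 1.0,
    # agreed vocabulary for academic terms
    "term":            lambda v: v in {"FALL", "SPRING", "SUMMER"},
}

def validate(record):
    """Return the list of rule violations for one record (empty = clean)."""
    return [field for field, rule in RULES.items()
            if field in record and not rule(record[field])]

records = [
    {"student_id": 1, "attendance_rate": 0.93, "term": "FALL"},
    {"student_id": 2, "attendance_rate": 93.0, "term": "FALL"},   # percent, not fraction
    {"student_id": 3, "attendance_rate": 0.88, "term": "Autumn"}, # non-standard label
]

violations = {r["student_id"]: validate(r) for r in records}
print(violations)
```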

3. Slow Financial Visibility

Funding, grants, tuition, purchasing, and budgeting often sit in separate systems, making reconciliation slow.

• Connect accounting, grants, procurement, and planning for one finance model

• Tie spend to objectives and refresh KPIs quickly

• Streamline audits with consistent structures and controls

Data analytics consulting

4. Surveys Without Context

Student, parent, faculty, and alumni surveys hold valuable signals, but mixed tools and formats make comparisons hard. Standardize surveys and join responses to SIS/LMS data. Suddenly, a shift in satisfaction aligns with schedule changes, program redesigns, or resource gaps, and action is clearer.

• Standardize instruments so results compare term to term

• Join surveys to SIS/LMS data to see cause and effect

• Track changes over time to inform program design

5. Clinical Programs Kept Apart

Nursing, medicine, and allied health track EHRs, clinic software, and simulation data separately from academics. Secure connectors merge clinical hours, competencies, and outcomes with the academic record. Education data integration shortens accreditation reporting and gives faculty a complete picture of progress.

• Secure connectors sync clinical hours, competencies, and outcomes

• Unified records show skills, progress, and accreditation evidence

• Faculty gain a complete view of each learner

6. Manual Work and Spreadsheet Stitching

Exports, copy‑paste, and one‑off scripts drain time and add risk. Replacing them with managed pipelines pays off in faster cycles and fewer late-night fixes.

• Managed pipelines to replace ad hoc work

• Change data capture keeps apps current where freshness matters

• Documented schedules and runbooks reduce midnight fixes

7. Security and Governance Gaps

As systems connect, risks rise. Define stewards, publish data dictionaries, and track lineage from source to dashboard. Encrypt sensitive data, enforce least‑privilege access, and audit regularly. With governance embedded, integration becomes safe and repeatable rather than fragile.

• Assign stewards and publish a data dictionary

• Encrypt sensitive fields and enforce least‑privilege access

• Audit regularly; track lineage from source to dashboard

8. Choosing an Approach to Data Integration

Match patterns to needs rather than forcing a one‑size‑fits‑all solution.

| Approach | Best For | Key Benefit |
| --- | --- | --- |
| ETL to Warehouse | Curated reporting, historical trends | Clean, conformed data |
| CDC/Event Streams | Operational syncs, near real-time | Low-latency updates |
| Data Virtualization | Fast access across sources | Minimal data movement |

• Pilot a narrow use case, prove value, then scale

• Balance freshness, complexity, and cost

• Reuse standards and components across projects
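
To illustrate the CDC/Event Streams pattern mentioned above, here is a toy snapshot-diff in Python. Real CDC tools (Debezium, for example) read the database’s change log; this sketch just compares two keyed snapshots to classify inserts, updates, and deletes so only changed rows flow downstream.

```python
# Change data capture, sketched as a diff between two snapshots keyed by id.
def diff(prev, curr):
    """Classify changes between two snapshots of the same table."""
    inserts = [k for k in curr if k not in prev]
    deletes = [k for k in prev if k not in curr]
    updates = [k for k in curr if k in prev and curr[k] != prev[k]]
    return {"insert": sorted(inserts),
            "update": sorted(updates),
            "delete": sorted(deletes)}

# Invented example: student enrollment records on two consecutive days.
yesterday = {101: {"status": "enrolled"}, 102: {"status": "enrolled"}}
today     = {101: {"status": "graduated"}, 103: {"status": "enrolled"}}

changes = diff(yesterday, today)
print(changes)  # only these rows need to move downstream
```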

How to Get Started with Data Integration

Map today’s flows, agree on shared definitions, and pick one high‑value pilot—unify SIS and LMS for early alerts, or connect finance for grant tracking. Build with maintainability in mind, train the team, and expand to the next priority. When you’re ready, we’re here to help.

Explore education solutions · Start an integration plan · Talk to us

Why Your ERP, CRM, and BI Tools Need Better Data Integration

Most businesses run on three core systems: ERP for operations, CRM for customers, and BI for insights. Without ERP, CRM, and BI Data Integration, data gets trapped in silos and critical context is lost. Effective data integration connects these systems so information flows in real time, reducing manual work and errors. When your tools share a single source of truth, teams make faster, smarter decisions and deliver a smoother customer experience. This is how you turn disconnected activity into coordinated growth.

Ready to break down those walls? Explore our data integration services to see how we can help.

The Real Cost of Data Silos

Picture this: Your sales team closes a big deal in the CRM, but your warehouse doesn’t know about it until someone manually updates the ERP. Meanwhile, your BI dashboard shows last week’s numbers because it can’t pull real-time data from either system.

Sound familiar? Here’s what data silos are costing you:

• Duplicate work and manual data entry

• Inconsistent reports across departments

• Delayed decisions based on outdated information

• Frustrated teams working with incomplete data

• Missed opportunities to serve customers better

This fragmented approach doesn’t just waste time—it actively hurts your ability to compete and grow.

Operational Excellence: When Data Integration Works Together

Imagine a different scenario. A customer places an order through your sales team, and instantly:

• Inventory levels update automatically in your ERP

• Production schedules adjust if needed

• Shipping timelines appear in real-time

• Customer service gets full order visibility

• Finance sees revenue impact immediately

This isn’t wishful thinking—it’s what happens when your systems are properly integrated. The result? Smoother operations, fewer errors, and teams that can focus on strategy instead of data entry.

Learn more about our data engineering solutions that make this possible.

Customer Relationships: The 360-Degree View

When your CRM and ERP share data, something powerful happens—you see the complete customer story:

| CRM Data | ERP Data | Combined Insight |
| --- | --- | --- |
| Sales interactions | Order history | Customer buying patterns |
| Marketing campaigns | Shipping details | Campaign effectiveness |
| Service tickets | Payment history | Customer satisfaction drivers |
| Lead sources | Product preferences | Best acquisition channels |
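
A tiny sketch of how that join works in code: merge CRM and ERP records on a shared customer ID to produce the combined view. Customer IDs and field names are invented for illustration; a real integration would do this in a warehouse or integration platform.

```python
# CRM and ERP views of the same customers, keyed by customer id (invented data).
crm = {
    "C1": {"segment": "enterprise", "last_campaign": "spring-webinar"},
    "C2": {"segment": "smb", "last_campaign": "email-q2"},
}
erp = {
    "C1": {"orders": 12, "lifetime_revenue": 48000},
    "C2": {"orders": 3,  "lifetime_revenue": 2100},
}

def customer_360(customer_id):
    """Merge CRM and ERP views of one customer into a single record."""
    merged = {"customer_id": customer_id}
    merged.update(crm.get(customer_id, {}))
    merged.update(erp.get(customer_id, {}))
    return merged

view = customer_360("C1")
print(view["segment"], view["orders"])  # context from both systems at once
```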

This unified view lets your team:

• Personalize every customer interaction

• Predict what customers need before they ask

• Identify upselling and cross-selling opportunities

• Resolve issues faster with complete context

Strategic Decisions: BI That Actually Works

Your BI tools are only as good as the data they can access. When connected to integrated ERP and CRM data, your dashboards transform from pretty charts into strategic weapons:

• Track real-time KPIs across all departments

• Spot trends before your competitors do

• Measure the true impact of marketing campaigns

• Understand which customers drive the most profit

• Make decisions based on complete, accurate data

For example, integrated data might reveal that customers acquired through social media campaigns have 40% higher lifetime value—but only if they purchase within their first 30 days. That’s the kind of insight that drives real business growth.
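
That kind of cohort insight is a straightforward aggregation once the data is joined. A sketch with invented numbers — the channel names, thresholds, and values are purely illustrative:

```python
from statistics import mean

# Joined CRM + ERP records (all numbers invented for illustration).
customers = [
    {"channel": "social", "days_to_first_purchase": 12, "ltv": 700},
    {"channel": "social", "days_to_first_purchase": 45, "ltv": 400},
    {"channel": "search", "days_to_first_purchase": 20, "ltv": 500},
    {"channel": "search", "days_to_first_purchase": 60, "ltv": 480},
]

def avg_ltv(channel, within_30_days):
    """Average lifetime value for a channel, split on the 30-day cohort."""
    vals = [c["ltv"] for c in customers
            if c["channel"] == channel
            and (c["days_to_first_purchase"] <= 30) == within_30_days]
    return mean(vals) if vals else 0

fast = avg_ltv("social", True)
slow = avg_ltv("social", False)
print(f"social, first purchase <=30d: {fast} vs >30d: {slow}")
```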

Making Data Integration Happen

Getting your systems to work together doesn’t have to be overwhelming. Here’s how successful organizations approach it:

Assessment & Planning

Start by mapping your current data flows and identifying the biggest pain points. Where are teams spending the most time on manual work? Which decisions are delayed by missing data?

Choose Your Integration Approach

  • Native integrations: Use built-in connections when available
  • Middleware solutions: Deploy integration platforms for complex scenarios
  • Modern data platforms: Leverage cloud-based tools for scalability

Focus on Business Value

Don’t integrate everything at once. Start with the connections that will have the biggest impact on your operations, customer experience, or decision-making.

Need help getting started? Contact our team to discuss your integration strategy.

The Bottom Line for Data Integration

Breaking down data silos isn’t just about technology—it’s about unlocking your organization’s potential. When your ERP, CRM, and BI tools work together, you get:

  • Faster operations with automated data flows
  • Happier customers through personalized experiences
  • Smarter decisions based on complete information
  • Competitive advantage through data-driven insights

The question isn’t whether you can afford to integrate your systems—it’s whether you can afford not to. Start your integration journey today and discover what your data can really do.

Transform Patient Care: Powerful Healthcare Analytics for Results

How Data Analytics Improves Patient Outcomes

In today’s healthcare world, data is transforming the way providers deliver care. At Arc Analytics, we see every day how smart use of healthcare analytics leads to better patient outcomes, more efficient operations, and sharper clinical decisions. With the right tools and expertise, healthcare organizations can turn raw data into actionable insights that truly make a difference.

What Is Healthcare Data Analytics?

Healthcare data analytics means using clinical, financial, and operational data to improve care and efficiency. With the rise of electronic health records (EHRs), medical imaging, claims, and patient surveys, healthcare organizations generate more data than ever. When analyzed well, this data reveals patterns that help providers make better decisions and improve patient care. Modern analytics platforms, like Qlik Answers, make it easier for teams to ask complex questions and get clear, actionable answers from their data.

Learn more about our healthcare analytics services.

Types of Analytics in Healthcare

| Type | What It Does | Qlik Solution |
| --- | --- | --- |
| Descriptive | Looks at past data to understand what happened | Qlik Answers |
| Diagnostic | Explains why certain outcomes occurred | Qlik Answers |
| Predictive | Forecasts future outcomes based on trends | Qlik Predict |
| Prescriptive | Recommends actions to optimize results | Qlik Automate |

Healthcare analytics platforms like Qlik make it possible to move seamlessly from understanding what happened, to why it happened, to what will happen next—and what you should do about it.

How Data Analytics Transforms Patient Care

Early Disease Detection & Risk Prediction

Analytics can spot patients at risk for chronic conditions—like diabetes or heart disease—before symptoms appear. Predictive models, powered by tools such as Qlik Predict, flag high-risk individuals, so providers can act early and prevent complications. This proactive approach helps healthcare teams shift from reactive care to prevention, improving long-term outcomes and reducing costs.
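
As a simplified stand-in for a trained model such as one built in Qlik Predict, here is a toy rule-based risk score. The thresholds and weights are invented for illustration only (not clinical guidance); a real model would be fit to historical outcomes.

```python
# Toy risk score for flagging patients for proactive follow-up.
def risk_score(patient):
    """Sum invented risk-factor weights for one patient record."""
    score = 0
    if patient["age"] >= 60:
        score += 2
    if patient["bmi"] >= 30:
        score += 2
    if patient["family_history"]:
        score += 1
    if patient["a1c"] >= 5.7:   # A1C of 5.7%+ falls in the pre-diabetic range
        score += 3
    return score

def flag_high_risk(patients, threshold=4):
    """Return ids of patients whose score meets the (invented) threshold."""
    return [p["id"] for p in patients if risk_score(p) >= threshold]

patients = [
    {"id": "P1", "age": 64, "bmi": 31.0, "family_history": True,  "a1c": 6.0},
    {"id": "P2", "age": 35, "bmi": 24.0, "family_history": False, "a1c": 5.1},
]
print(flag_high_risk(patients))  # patients to contact before symptoms appear
```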

Personalized Treatment Planning

Every patient is unique. By analyzing outcomes from similar cases, providers can tailor treatments to each person’s needs, improving results and reducing side effects. With Qlik Answers, clinicians can quickly compare patient histories and treatment responses, making it easier to design care plans that are truly personalized. This data-driven approach ensures that each patient receives the most effective therapies based on real-world evidence.

Reducing Hospital Readmissions

Unplanned readmissions are costly and often preventable. Analytics helps identify patients likely to return, so care teams can offer extra support, better discharge planning, and follow-up care. Qlik Automate can streamline these processes by triggering automated alerts and workflows for at-risk patients, ensuring that no one falls through the cracks and that interventions happen at the right time.

Case Study: Recovery Center Patient Lifecycle Demo

We recently built a demo app that shows the power of integrated healthcare analytics. This tool connects data from a recovery center—Google Analytics, patient surveys, and clinical records—to map the entire patient journey.

Tracking the Patient Journey

From first contact to discharge, our demo visualizes each step, helping administrators spot bottlenecks and improve care delivery. By integrating multiple data sources, the application provides a 360-degree view of each patient’s experience, making it easier to identify where improvements can be made.

Real-Time Alerts for High-Risk Patients

Using Qlik’s alerting and GeoAnalytics, the app sends real-time notifications when a patient is at risk, even mapping emergency contacts nearby for rapid support. Qlik Automate ensures these alerts are delivered instantly to the right care team members, so action can be taken without delay.

Strengthening Support Networks

By connecting patients with their support networks, facilities can intervene quickly and improve long-term outcomes. The demo leverages Qlik’s advanced mapping and automation features to ensure that support is always within reach, especially during critical moments in a patient’s recovery.

See how our analytics solutions work in action.

Implementing Data Analytics in Healthcare

Building the Right Infrastructure

A strong analytics program needs secure storage, real-time processing, and tools to connect different systems. Qlik’s cloud-based solutions make it easy to scale your analytics infrastructure as your needs grow, while maintaining security and compliance. Explore our data engineering services.

Scalable Governance

We use a scalable governance approach, so your data quality, security, and compliance grow with your analytics capabilities. Qlik’s governance features help ensure that sensitive health data is protected and that analytics remain trustworthy as your organization evolves.

Overcoming Challenges

Data quality, legacy systems, and privacy are real hurdles. Our team helps you plan, integrate, and train for success, leveraging Qlik’s integration and automation tools to simplify even the most complex environments.

The Future: AI & Advanced Analytics

The next wave in healthcare analytics is AI and machine learning—tools that find complex patterns and predict outcomes with new accuracy. With Qlik Predict and open-source platforms, we help organizations prepare for this future, implementing scalable solutions that keep you at the forefront of healthcare innovation.

Why Arc Analytics?

Every healthcare organization is different. We combine technical skill with healthcare know-how to deliver custom analytics that fit your needs and drive real improvements in patient care. Our team works closely with yours to ensure that Qlik’s powerful features—like Qlik Answers, Qlik Predict, and Qlik Automate—are fully leveraged for your unique challenges.

Ready to see what data can do for your patients?


Contact Arc Analytics today to learn how our solutions can help you improve outcomes, boost efficiency, and deliver more personalized care.

Signs Your Organization Needs a Data Consultant Now

When Is It Time to Call a Data Consultant?

Every organization wants to turn data into a competitive edge, but for many in healthcare, education, state government, and small to medium-sized businesses (SMBs), the path isn’t always clear unless you’re partnered with a strong data consultant. If you’re struggling to connect the dots between your data and real-world results, you’re not alone. Recognizing the signs early—and knowing where to get help—can make all the difference.

If you’re in healthcare, you might notice that patient data is scattered across EHRs, billing, and departmental systems, making it tough to see the full picture. In education, student information, learning management, and alumni data often live in silos, blocking a unified view of student progress. State governments face similar hurdles with legacy systems and fragmented agency data. SMBs, meanwhile, often collect sales and marketing data but lack the resources to turn it into actionable insights.

Don’t let data challenges hold you back. Explore our data analytics consulting services or contact us to see how a data consultant can help.

Industry-Specific Data Challenges that a Data Consultant Solves

Healthcare

Healthcare organizations often struggle to translate raw data into better patient care. Disconnected systems, inconsistent records, and a lack of clear data strategy can slow progress. If your IT team is overwhelmed or dashboards aren’t delivering the insights you need, it’s time to consider outside expertise. Learn more about our healthcare data analytics services.

Education

Educational institutions face challenges connecting student data to learning outcomes. Siloed systems, unreliable attendance or grading data, and no clear roadmap for data-driven teaching can hinder student success. If your team can’t keep up or you’re not spotting at-risk students early, a data consultant can help. See our education analytics solutions.

State Government

State agencies often find that citizen data isn’t driving better services due to fragmented legacy systems and inconsistent reporting. Without a unified data strategy, it’s hard to inform policy or improve programs. If your IT team is stretched thin or your reports aren’t actionable, it’s time to act. Discover our government analytics expertise.

Small & Medium Businesses (SMBs)

SMBs may collect plenty of data but struggle to make sense of it. Data scattered across apps, duplicate or missing customer info, and limited IT resources can make it hard to compete. If you can’t see sales trends or predict customer needs, you’re missing out on growth opportunities. Check out our data analytics services for SMBs.

Common Data Challenges: At a Glance

These challenges appear across healthcare, education, state government, and SMBs alike:

• Data silos & fragmentation

• Poor data quality

• No clear data strategy

• Overwhelmed IT teams

• Weak data visualization and reporting

• Underuse of advanced analytics

What a Data Consultant Brings

A data consultant offers a fresh perspective, specialized industry knowledge, and the technical skills to solve your toughest data problems. They can help you break down silos, improve data quality, and build a roadmap for success. Plus, they empower your team with best practices and keep you up to date with the latest tools and trends.

Ready to take the next step? Meet our team or explore our full range of services.

Conclusion: Invest in Data-Driven Success

Don’t let data challenges slow you down. Whether you’re in healthcare, education, government, or running a growing business, the right consultant can help you unlock the full potential of your data. Contact us today to start your journey toward smarter decisions and better results.

How Qlik Cloud Improves Public Safety Outcomes

In the complex and critical realm of public safety, timely and insightful data is the bedrock of effective decision-making. From anticipating potential threats to optimizing emergency responses, the ability to rapidly analyze vast amounts of information can quite literally save lives and improve community well-being. This is where Qlik Cloud Analytics steps in, transforming raw data into actionable intelligence that empowers school systems, law enforcement, emergency services, and community leaders.

At Arc Analytics, we’ve seen firsthand how integrating diverse data sets within Qlik Cloud can create a truly powerful picture of public safety dynamics. To illustrate this, we’ve created a unique demonstration that weaves together seemingly disparate data points, providing an unprecedented level of insight into community safety.

Unveiling Insights: A Multi-Layered Look at Community Safety

This demonstration uses Qlik Cloud to visualize complex public safety scenarios, combining publicly available data from across Florida. Nearly every layer shown is publicly available on its own, yet layered together they surface critical information for understanding safety concerns. These datasets include:

  • Florida School Grading System (1999-2023): One layer presents the entire state of Florida’s public school grading system from 1999 to 2023. You can see the grades and precise locations of schools across the state, allowing you to gauge educational performance visually.
  • Pinellas County Crime Data (Last 10 Years): Superimposed on this, another layer displays Pinellas County crime data for the last decade. This isn’t just dots on a map; it’s a dynamic heat map that visually represents the severity of crime, indicating when and where incidents occurred. This gives a visceral sense of criminal activity patterns.
  • Pinellas County Sex Offender Locations: Perhaps one of the most impactful layers shows the locations of all registered sex offenders in Pinellas County. What makes this particularly compelling is the interactive element: when you hover your cursor over an offender’s location, their mugshot instantly appears.
  • Pinellas County Bus Routes: A crucial layer reveals the bus routes of Pinellas County. By toggling this on, you can see the lines on the map, allowing for a visual correlation between public transportation arteries and areas with higher crime rates. This insight can be vital for understanding movement patterns and potential vulnerabilities.
  • Pinellas County SNAP Locations: An open data source that shows each eligible location in Florida that accepts SNAP benefits for the program recipient.

The Critical Role of Monitoring Public School Education in Community Safety

Tracking the educational performance of public schools is vital for understanding community well-being and long-term public safety. By monitoring school grades and trends, stakeholders can identify areas needing support, allocate resources more effectively, and address underlying issues that may impact youth outcomes. Data-driven insights into education not only help improve academic achievement but also contribute to safer, more resilient communities by empowering early intervention and informed decision-making.

Twenty years of trending Florida School Accountability Reports

Optimizing Resource Allocation

Public safety agencies often face the challenge of deploying limited resources to cover vast and dynamic areas. Qlik Cloud provides the analytical horsepower to make these critical decisions with precision:

  • Dynamic Deployment Strategies: By integrating crime data, population density, historical incident reports, and even real-time events, Qlik Cloud allows commanders to visualize hotspots and allocate police patrols, fire services, or emergency medical teams more effectively.
  • Staffing Optimization: Analyzing call volumes, response times, and incident types helps agencies determine optimal staffing levels and shift schedules, ensuring adequate coverage where and when it’s most needed.
  • Infrastructure Planning: Understanding the correlation between infrastructure (like bus routes, as seen in the demo) and incident patterns can inform decisions about where to increase surveillance, improve lighting, or adjust public transport schedules to enhance safety.

Proactive Crime Prevention

Moving beyond reactive responses, Qlik Cloud empowers agencies to adopt proactive crime prevention strategies:

  • Pattern Recognition: The ability to layer data like school locations, crime hotspots, and known offender residences helps identify subtle patterns that might indicate elevated risks in certain areas or times. This allows for targeted community engagement or increased presence.
  • Intervention Program Effectiveness: By tracking the locations and characteristics of crime, agencies can evaluate the effectiveness of community programs, youth outreach, or neighborhood watch initiatives. Qlik can show if interventions in specific areas are truly leading to a reduction in incidents.
  • Risk Area Identification: The demo’s ability to show sex offender locations relative to schools and homes is a prime example of how Qlik can highlight vulnerable areas, allowing for informed alerts to school officials, parents, and community members.

Overview of crime data within Pinellas County

Enhancing Emergency Response and Coordination

In emergencies, every second counts. Qlik Cloud facilitates faster, more informed responses:

  • Real-time Situational Awareness: By integrating live feeds from sensors, traffic cameras, and dispatch systems, Qlik Cloud can provide first responders with real-time dashboards showing the evolving situation, allowing for quicker decisions on routes, hazards, and necessary resources.
  • Incident Command Support: During large-scale emergencies, Qlik applications can consolidate information from multiple agencies (police, fire, medical, public works) into a single, intuitive view, enhancing coordination and resource deployment.
  • Post-Incident Analysis: After an event, Qlik Cloud allows for thorough analysis of response times, resource utilization, and outcomes, providing invaluable lessons for future emergency planning and training.

This powerful mapping tool allows users to toggle each layer on and off, revealing a dynamic narrative of public safety. For instance, a school district official or a concerned parent can easily visualize the proximity of a registered sex offender to their nearest school or even their home location. This immediate, visual insight provides a critical understanding of potential risks that text-based reports simply cannot convey.

The real underlying point of this demo, even though it uses publicly sourced data to paint a vivid picture, is this: the more data sources you can bring together, and the deeper the knowledge they carry, the better the decisions you can make, no matter your position, role, or concern for public safety. It’s about empowering everyone with the insights they need.

Geometry lines of 20-minute walking distance to school, sex offenders (with mugshot), and SNAP Benefit in downtown St Petersburg

The Power of Data Collaboration for a Safer Tomorrow

The true strength of Qlik Cloud in public safety lies not just in its individual analytical capabilities but in its ability to foster data collaboration. Public safety is rarely the responsibility of a single entity. It involves complex interactions between law enforcement, fire departments, emergency medical services, local government, schools, and community organizations.

Qlik Cloud provides a unified, secure platform where these diverse stakeholders can share, visualize, and collaborate on critical data. This breaks down traditional information silos, enabling:

  • Cross-Agency Insights: Police departments can share crime trends with school districts, allowing for joint safety initiatives. Emergency services can share incident data with urban planners to identify areas needing better infrastructure.
  • Informed Community Engagement: Making relevant, aggregated data accessible to the public, as demonstrated by the school performance aspects of this demo, fosters greater community trust and encourages informed citizen participation in safety efforts.
  • Proactive Policy Making: Legislators and city planners can use these comprehensive datasets to inform policy decisions, allocate budgets, and design safer communities based on clear evidence rather than assumptions.

Partnering for Public Safety with Arc Analytics

At Arc Analytics, we are dedicated to helping public safety agencies and communities harness the full potential of Qlik Cloud. This example is just one of the powerful, actionable insights that can be gleaned when data is integrated and visualized effectively. We understand the sensitive nature of public safety data and ensure that our solutions adhere to the highest standards of security and compliance.

We work closely with organizations to:

  • Integrate Disparate Data Sources: Bringing together information from police records, school systems, public transportation, and other critical databases.
  • Develop Custom Analytical Applications: Building tailored Qlik solutions that address your specific public safety challenges and objectives.
  • Provide Expert Implementation and Training: Ensuring your teams are proficient in using Qlik Cloud to drive continuous improvements in public safety.

By transforming raw data into clear, actionable intelligence, Qlik Cloud Analytics, supported by Arc Analytics’ expertise, empowers public safety professionals and concerned citizens alike to make better decisions, protect communities, and build a safer future for everyone.

Ready to see the power of Qlik Cloud Analytics in action? Schedule a full demo with our team today and discover how data-driven insights can transform public safety in your community.

Adding Tabler Icons to Qlik Dashboards

Adding Tabler Icons to Qlik Dashboards

Qlik doesn’t offer much in the way of built-in icons. When creating dashboards in Qlik Cloud, it can be very helpful to add icons to spruce up KPIs, titles, and tables. There are hundreds of use cases for adding visual flair with icons, but doing so in Qlik can be cumbersome because there are so few built-in icon options.

So, how can we go about adding some icons to our dashboards in an easy and expressive way?

We can use a font! But wait, we’re talking about icons, not text. So how will a font help us?
It turns out that fonts can pack in far more than just standard characters like letters, numbers, and punctuation. One example of a “supercharged” font is Tabler Icons.

Tabler Icons is an open source project that bundles thousands of beautiful icons into multiple formats that you can use freely in your web projects. One such format is a webfont, specifically .ttf, which is a TrueType font file.

Tabler Icons Website

How can we use this font in Qlik?

We’ll add it to a custom Qlik theme and choose icons in our dashboard using variable expansion with a parameter.

Don’t worry if this doesn’t quite make sense yet! Let’s go through each step now.

Steps to set up

Download the Tabler Icons webfont.

  • We can find the tabler-icons.ttf font file in the Tabler Icons Webfont package on the NPM website:

tabler icons webfont

Create a new Qlik theme or open an existing one.

  • If you don’t already have a custom theme to add this font to, go ahead and create one based on the instructions laid out on the Qlik Help website. You can also look online for a Qlik theme generator to help get you started.

Add tabler-icons.ttf to the Qlik theme folder.

  • Move the tabler-icons.ttf file to your custom Qlik theme folder. It should look similar to this:
tabler icons webfont

Add @font-face to the theme CSS file.

  • Open your theme’s .css file and add this snippet at the top:
    @font-face {
      font-family: "Tabler Icons";
      src: url("tabler-icons.ttf") format("truetype");
      font-weight: normal;
      font-style: normal;
    }
  • Save and close the file.

Add or modify the fontFamily property in the theme JSON file:

  • Open your theme’s .json file and add this snippet near the top:
    "fontFamily": "Open Sans, Tabler Icons, sans-serif"
  • Here’s an example of what it should look like:
ℹ️ Note that in the screenshot above, our snippet includes the Open Sans font, as we want that to be the primary font for normal characters like letters and numbers. You can replace it with any of the default Qlik Cloud font options:

Upload file to Qlik Cloud or Qlik Sense Client-Managed.

  • To add your custom theme to Qlik, you must first save the theme folder as a ZIP file.
  • How to upload your theme to Qlik Cloud
    • You can follow this Qlik Help page guide on how to upload your theme ZIP file to Qlik Cloud.
    • After uploading the theme ZIP file, you should see the theme in the Admin Console:
  • How to upload your theme to Client-Managed
    • On Qlik Sense Client-Managed, import the theme ZIP file as an extension through the QMC.

Create a helper variable in the load script.

  • Open a Qlik app and add this Qlik script to the Data Load Editor in the Main section:
    Set GetTablerIcon = Chr(Num#('$1', '(hex)'));
  • Your Main script should look like this:

How to use icons in a dashboard

  • Find the icon you want on the Tabler Icons website and click it. In the box that appears, click on the hex value to copy it:
  • Go to the app Sheet view and switch the app theme to use our uploaded theme.
In your Qlik app, select any Qlik object, and then choose an expression property.

  • For example, you can create or select a bar chart object and then open the expression editor for the Title expression.
In the property’s expression editor, we’ll use dollar-sign expansion with our GetTablerIcon variable and pass our copied Tabler Icon hex code as the parameter.

  • Make this the expression:
    =$(GetTablerIcon(ea59)) & ' Sales by County'
  • Then select the Apply button to save it.

You should now see your icon in the chart title!
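Under the hood, the GetTablerIcon variable simply converts a hex code point into a single Unicode character. Here’s a quick Python sketch of the same conversion, using the ea59 code point from the example expression above:

```python
# Mirrors the Qlik script variable:
#   Set GetTablerIcon = Chr(Num#('$1', '(hex)'));
# i.e., parse a hex code point and return the corresponding character.
def get_tabler_icon(hex_code: str) -> str:
    """Return the Unicode character for a hex code point like 'ea59'."""
    return chr(int(hex_code, 16))

title = get_tabler_icon("ea59") + " Sales by County"
# The result is an ordinary string; the ea59 character only renders as an
# icon where the Tabler Icons font is applied, and shows as a placeholder
# glyph everywhere else.
```

That’s also why the troubleshooting step matters: the character is always present in the title, but it only looks like an icon once the Tabler Icons font is active for that property.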

If your icon doesn’t appear or you see a placeholder character in the chart title where our icon should be, you probably just need to update the font property.

  • To do this, go to the chart styling tab:
  • Find the font property we want to change (Title in this example) and then choose the option that includes Tabler Icons:
ℹ️ Note that if you want to “pair” the Tabler Icons font with a primary font that regular characters will use, refer back to step 5.

Summary

You should now be able to use Tabler Icons anywhere in a Qlik dashboard that supports text expressions and changing the font!

That should get you very far. Try changing the font color and size to see how the icons scale and can be recolored just like text.

Building A Robust Data-Driven Culture

Building A Robust Data-Driven Culture

In today’s fiercely competitive business landscape, data has moved beyond the realm of simple record-keeping to become the very engine of strategic advantage. Organizations that can effectively harness the insights hidden within their data streams are demonstrably more agile, innovative, and ultimately, more successful. However, the journey towards becoming a truly data-driven organization is not merely about deploying sophisticated analytics platforms. It requires a fundamental shift in culture, a deep-seated commitment that permeates every level of the organization, from the executive suite to individual contributors. This comprehensive guide will navigate the essential steps involved in cultivating a robust data-driven culture, underscoring its profound benefits and illuminating the critical role of people, processes, and technology in this transformative endeavor.

Laying the Foundation: Identifying Key Pain Points and Opportunities

The initial and foundational stage in building a data-driven culture involves a collaborative and thorough effort to pinpoint the specific areas within the organization where data can exert the most significant positive influence. This process extends beyond simply identifying obvious operational bottlenecks or areas of inefficiency. It necessitates engaging stakeholders from across all departments – sales, marketing, operations, finance, customer service, and beyond – to understand their unique challenges and the questions they struggle to answer with existing information. For instance, the marketing team might grapple with understanding which campaigns yield the highest return on investment, while the sales team might lack clarity on the characteristics of their most successful leads. Operations could be struggling with unpredictable supply chain disruptions, and customer service might be reactive rather than proactively addressing potential issues.

Furthermore, the focus should not solely be on rectifying problems. A truly data-driven mindset actively seeks opportunities where data can fuel innovation, enhance the customer experience in meaningful ways through personalization, optimize the allocation of resources across various initiatives, and even identify entirely new business models. By involving a diverse range of perspectives, organizations can uncover a broader spectrum of both pain points ripe for data-driven solutions and untapped opportunities waiting to be unlocked. Prioritizing these identified areas based on their potential impact on key business objectives and the practical feasibility of implementing data-driven solutions will ensure that initial efforts are strategically aligned and deliver tangible value, fostering early buy-in and demonstrating the power of a data-centric approach.

Empowering Solutions: Leveraging Data to Solve Problems and Drive Innovation

Once the key pain points and promising opportunities have been identified, the next crucial step involves strategically applying various methodologies of data analysis to extract meaningful insights and drive tangible improvements. This encompasses a spectrum of analytical techniques, each suited to answering different types of questions. Descriptive analysis provides a historical overview of what has occurred, offering valuable context. Diagnostic analysis delves deeper, seeking to understand the underlying reasons and correlations behind observed trends. Predictive analysis leverages historical data and statistical modeling to forecast future outcomes and anticipate potential challenges or opportunities. Finally, prescriptive analysis goes beyond prediction by recommending specific actions and interventions to achieve desired results.

For example, if a sales team is struggling with high customer churn, diagnostic analysis might reveal specific customer segments or interaction patterns that are strong indicators of attrition. Predictive modeling could then forecast which current customers are most likely to churn, allowing for proactive intervention. Prescriptive analytics could even recommend targeted strategies, such as personalized offers or enhanced support, to mitigate this risk. Similarly, in product development, analyzing customer feedback data (both structured and unstructured) can provide invaluable insights into unmet needs, guiding the creation of innovative new features or products. The process of leveraging data for problem-solving and innovation is iterative, requiring a willingness to formulate hypotheses, rigorously test them against available data, and refine analytical approaches based on the evidence uncovered. Embracing a culture of experimentation, including A/B testing different data-driven strategies, is essential for validating their effectiveness and fostering a continuous cycle of improvement and learning.
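To make the four analytics types concrete, here is a toy Python sketch of the churn example, with all data, thresholds, and rules invented purely for illustration:

```python
# Toy illustration of descriptive, diagnostic, predictive, and prescriptive
# analytics on invented churn data.
# Each customer: (monthly support tickets, months since last purchase, churned?)
customers = [
    (1, 1, False), (0, 2, False), (2, 1, False),
    (5, 6, True), (6, 5, True), (4, 7, True),
]

# Descriptive: what happened? -> the overall churn rate
churn_rate = sum(c[2] for c in customers) / len(customers)

# Diagnostic: why? -> compare ticket volume for churned vs. retained customers
avg_tickets_churned = sum(c[0] for c in customers if c[2]) / sum(c[2] for c in customers)
avg_tickets_retained = sum(c[0] for c in customers if not c[2]) / sum(not c[2] for c in customers)

# Predictive: who is likely to churn next? -> a crude threshold "model"
def churn_risk(tickets: int, months_inactive: int) -> bool:
    return tickets >= 4 or months_inactive >= 5

# Prescriptive: what should we do? -> recommend an intervention per at-risk customer
def recommend(tickets: int, months_inactive: int) -> str:
    if not churn_risk(tickets, months_inactive):
        return "no action"
    return "offer support review" if tickets >= 4 else "send re-engagement offer"

print(churn_rate)       # 0.5 for this toy dataset
print(recommend(5, 2))  # offer support review
print(recommend(1, 6))  # send re-engagement offer
```

In practice the "model" would be a trained statistical or machine learning model rather than a hand-picked threshold, but the progression from describing to diagnosing to predicting to prescribing is the same.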

Cultivating Data Fluency: The Cornerstone of a Data-Driven Culture

The successful and sustainable embedding of a data-driven culture within an organization fundamentally relies on cultivating a high degree of data fluency across all levels of its workforce. This does not imply that every employee needs to become a data scientist or possess advanced statistical expertise. Instead, it signifies fostering a widespread comfort level in working with data, enabling individuals to understand basic data concepts, interpret visualizations, formulate relevant questions based on data, and confidently utilize data-backed insights in their daily decision-making processes. The specific levels of data literacy required will naturally vary depending on individual roles and responsibilities. However, a foundational understanding of data privacy, ethical data usage, and the ability to critically evaluate data sources are essential for everyone.

Organizations can adopt a multi-pronged approach to elevate data literacy. This includes implementing comprehensive training programs tailored to different skill levels and roles, creating easily accessible internal resources such as data glossaries, style guides for data interpretation, and case studies showcasing successful data application. Mentorship programs that pair data experts with colleagues seeking to enhance their skills can also be highly effective. A critical element is ensuring that data is presented in an accessible and understandable manner for non-technical users, often through user-friendly dashboards and intuitive data visualization tools that abstract away unnecessary complexity. Leadership plays a pivotal role in championing data literacy initiatives by actively demonstrating the value of data in their own decision-making processes, visibly supporting training efforts, and fostering an environment where asking data-related questions is not only encouraged but expected. Ultimately, nurturing a culture of intellectual curiosity, where employees are empowered to explore data and seek evidence-based answers, will solidify data fluency as a core organizational competency and drive widespread adoption of data-driven practices.

Equipping Your Team: Choosing and Implementing the Right Data Tools

The strategic selection and effective implementation of appropriate data tools are critical enablers of a data-driven culture. The right tools can democratize access to data, empower users to perform their own analyses, and streamline the process of generating insights. When evaluating potential data tools and platforms, organizations should consider several key criteria. Usability for a diverse range of users, regardless of their technical proficiency, is paramount. Seamless integration capabilities with existing systems and data sources are essential to break down silos and ensure data accessibility. Scalability to handle growing data volumes and evolving analytical needs is crucial for long-term viability. Robust security features are non-negotiable to protect sensitive data and ensure compliance with relevant regulations. Finally, the overall cost-effectiveness of the tools, considering both initial investment and ongoing maintenance, must be carefully evaluated.

Platforms like Qlik Cloud offer a powerful and versatile suite of capabilities designed to foster a data-driven environment. Their intuitive and interactive data visualization tools empower users to create insightful dashboards and reports with minimal technical expertise, while their robust data integration features facilitate the connection and harmonization of data from disparate sources. Features such as collaborative analytics enable teams to work together on data exploration and insight generation, and embedded analytics capabilities allow for the seamless integration of data insights into existing applications and workflows. However, simply selecting the right tools is only part of the equation. Successful adoption necessitates a well-planned implementation strategy, comprehensive training programs to ensure users can effectively leverage the tools’ features, and ongoing support to address any technical challenges or user questions. Furthermore, establishing clear data governance policies and procedures is essential to ensure the quality, accuracy, and trustworthiness of the data being utilized within these tools, fostering confidence and driving adoption.

Conclusion: Embracing Data as the Engine of Success

In conclusion, the journey towards building a truly robust and impactful data-driven culture requires a holistic and sustained effort that encompasses people, processes, and technology. By systematically identifying key pain points and opportunities, empowering data-driven solutions, cultivating widespread data fluency across the organization, strategically selecting and implementing the right data tools, and diligently sustaining the momentum through continuous learning and leadership commitment, organizations can transform data from a latent asset into the very engine of their success, driving innovation, enhancing efficiency, fostering deeper customer understanding, and ultimately achieving a significant and sustainable competitive advantage in today’s data-rich world.

What to Expect From a Data Analytics Consulting Partner

What to Expect From a Data Analytics Consulting Partner

Navigating the world of data analytics can feel like trying to decipher an ancient language. You know the potential is there – those hidden insights that can propel your business forward – but unlocking them often requires a skilled guide. That’s where a data analytics consulting partner comes in. But just like choosing the right travel companion, finding the right partner can make all the difference between a smooth journey and a frustrating detour. This isn’t just about someone setting up a dashboard and calling it a day. A true partner becomes an extension of your team, deeply understanding your unique challenges and working collaboratively to achieve your specific goals. So, what should you really expect from this kind of relationship?

What Should You Be Receiving? It’s More Than Just Deliverables.

When you engage a data analytics consulting partner, you’re not just buying a service; you’re investing in expertise and a collaborative relationship. Here are some key things you should expect to receive:
  • Flexibility That Fits Your Needs: Forget rigid contracts and pre-packaged solutions. A good partner understands that your needs can evolve. Expect pre-authorized hours that provide budget control while allowing for necessary work to be completed. Think of it as setting a clear boundary, like saying, “Let’s scope this project within 40 hours, and if we need more, we’ll talk.” This demonstrates respect for your budget and ensures transparency. Furthermore, look for flexible service level agreements (SLAs). These shouldn’t be one-size-fits-all. A partner should be willing to tailor SLAs – perhaps a standard four-hour response time for typical requests, which might even adjust to a quicker 30-minute response during critical periods – all tied to clearly defined scopes of work and agreed-upon hourly rates. This adaptability shows they’re truly invested in supporting your business rhythm.
  • A Consistent and Dedicated Point of Contact: Imagine having to explain your project to a new person every time you reach out. Frustrating, right? Expect a dedicated, 1:1 relationship where you work with a consistent team that builds a deep understanding of your business, your data, and your Qlik Cloud environment. This eliminates the inefficiencies of multiple touchpoints and the impersonal feel of large, impersonal firms relying on offshore subcontractors who may not have the same level of direct investment in your success. You deserve a team that’s in the trenches with you, not just filling out timesheets from afar.
  • Proactive Partnership, Not Just Order-Taking: A great consulting partner doesn’t just wait for you to tell them what to do. They should be proactive, bringing insights and suggestions to the table based on their understanding of your business and the capabilities of Qlik Cloud. Expect regular check-ins – not just status updates, but strategic conversations about progress, potential roadblocks, and future opportunities. They should be genuinely invested in understanding your specific business goals and tailoring their approach to help you achieve them.

Signs You Might Need a New Qlik Consulting Partner: Don’t Settle for Less.

Are you getting everything you should be from your current data analytics partner? Here are some red flags that might indicate it’s time for a change:
  • Silence is Not Golden: A lack of proactive communication or consistently missed deadlines are clear indicators that your partner isn’t prioritizing your needs. You shouldn’t have to constantly chase them for updates or feel like your project is on the back burner.
  • Quote Chaos: Receiving inaccurate quotes or having to constantly request updated pricing signals a lack of attention to detail and can lead to budget surprises. Transparency in pricing is crucial for building trust.
  • The Price Doesn’t Feel Right: Be wary of price gouging, an unfortunately common practice in technology sales, especially within sectors like state and local government. A trustworthy partner will be upfront and transparent about their pricing and licensing models, ensuring you’re paying a fair market value for the Qlik Cloud products and services. They should be working to get you the best value, not just maximizing their profit at your expense.
  • Where’s the Innovation?: If your partner isn’t bringing innovative solutions or demonstrating a deep understanding of the latest Qlik Cloud features and how they can benefit your specific industry, you might be missing out. A good partner stays ahead of the curve and helps you do the same.
  • They Don’t “Get” Your Business: A partner who doesn’t take the time to understand the unique nuances and challenges of your specific industry is less likely to deliver truly impactful solutions. Generic advice won’t cut it.
  • The Feeling’s Not Mutual: Ultimately, if you feel like your current partner isn’t truly invested in your success, isn’t communicative, or isn’t providing the level of service you expect, it’s a strong sign that it might be time to explore other options.

The Value of Industry Expertise: Why It Matters.

While a broad understanding of data analytics is essential, a partner with proven experience in your specific industry can bring invaluable insights. For example, Arc Analytics specializes in the education, healthcare, and government sectors, where we’ve developed a deep understanding of the unique data challenges and regulatory landscapes. However, our experience isn’t limited to these verticals. Our history of completing hundreds of projects across several sectors demonstrates our adaptability and our ability to apply our Qlik Cloud expertise to diverse business needs. This cross-industry experience allows us to bring best practices and innovative solutions from different fields to your specific challenges.

Deliverables: Tailored Solutions Designed for You.

Forget the idea of a one-size-fits-all solution. A quality data analytics consulting partner understands that your needs are unique. Expect solutions that are configured and customized to the specific scope of work you define. This means the dashboards, reports, and integrations you receive are designed to answer your specific business questions and track your key performance indicators. While the underlying technology might be consistent, the final deliverables should feel like they were built for you, not just for anyone. We also believe in transparent reporting when it comes to accounting, so you have a clear understanding of the investment you’re making.

Finding the Right Fit: It’s About More Than Just Technology.

Choosing a data analytics consulting partner is a significant decision. It’s about finding a team that not only possesses the technical expertise with Qlik Cloud but also prioritizes clear communication, genuine collaboration, and a deep understanding of your business. You deserve a partner who feels like a natural extension of your own team, dedicated to helping you unlock the full potential of your data. Ready to explore how a dedicated and experienced Qlik Cloud partner can help you achieve your data analytics goals? We invite you to reach out and discover the difference a true partnership can make. Contact our team.

Connecting to Qlik Sense with Python

Connecting to Qlik Sense with Python

While Qlik has maintained and supported developer libraries in JavaScript and .NET/C# for several years, it has more recently released a library for interacting with Qlik in Python. Qlik calls it the Platform SDK, and it is also available as a TypeScript library.

The Python library is essentially a set of Python classes and methods that mirror the structures and functions of the Qlik QRS and Engine APIs, also providing some conveniences around authentication and WebSocket connections. The library is open for anyone to download and use thanks to its permissive MIT license.

The use cases for the Qlik Python SDK include writing automation scripts for repetitive admin tasks, loading app and object data into a Pandas dataframe, and even creating reports built from app or log data.

Installing the library is very simple — just make sure you are using at least Python 3.8:

python3 -m pip install --upgrade qlik-sdk

Let’s look at some examples of how we can use the library. Below, we import a few classes from the qlik_sdk library and then create some variables to hold our Qlik Cloud tenant URL and API key. We’ll use the API key to authenticate with a bearer token, but an OAuth 2.0 implementation is also available. Learn how to generate an API key here. The tenant URL and API key are then used to create an Apps object, which provides some high-level methods for interacting with app documents in Qlik Cloud.

from qlik_sdk import Apps, AuthType, Config

# connect to Qlik engine

base_url = "https://your-tenant.us.qlikcloud.com/"
api_key = "xxxxxx"
apps = Apps(Config(host=base_url, auth_type=AuthType.APIKey, api_key=api_key))

Now that we’ve got our authentication situated, let’s add some code to interact with a Qlik app and its contents. First, let’s import a new class, NxPage, which describes a hypercube page (more about Qlik hypercubes here). Then let’s create a new function, get_qlik_obj_data(), to define the steps for getting data from a Qlik object, like a table or bar chart. In this function, we take an app parameter and an obj_id parameter to open a WebSocket connection to the specified app, get the app layout, get the size of the object’s hypercube, and then fetch the data for that hypercube:

from qlik_sdk.apis.Qix import NxPage

app = apps.get("xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx")

def get_qlik_obj_data(app: NxApp, obj_id: str) -> list:
    """Get data from an object in a Qlik app."""

    # opens a websocket connection against the Engine API and gets the app hypercube

    with app.open():
        tbl_obj = app.get_object(obj_id)
        tbl_layout = tbl_obj.get_layout()
        tbl_size = tbl_layout.qHyperCube.qSize
        tbl_hc = tbl_obj.get_hyper_cube_data(
            "/qHyperCubeDef",
            [NxPage(qHeight=tbl_size.qcy, qWidth=tbl_size.qcx, qLeft=0, qTop=0)],
        )
    
    return tbl_hc


obj_data = get_qlik_obj_data(app=app, obj_id="xxxxxx")

This code would end up returning a list of data pages, something like this:

[NxDataPage(qArea=Rect(qHeight=50, qLeft=0, qTop=0, qWidth=10), qIsReduced=None, qMatrix=[NxCellRows(), NxCellRows(), NxCellRows(), ...]

And if we then peek into one of the NxCellRows items contained in the qMatrix property, we’d see an object like this:

NxCell(qAttrDims=None, qAttrExps=None, qElemNumber=29, qFrequency=None, qHighlightRanges=None, qInExtRow=None, qIsEmpty=None, qIsNull=None, qIsOtherCell=None, qIsTotalCell=None, qMiniChart=None, qNum=282, qState='O', qText='282')

The cell value is shown as 282 in the qText property. We may note, though, that we can’t readily identify the field that this value represents.

Let’s add some code to make the resulting dataset include the field name for each cell value. We can do that by adding a get_ordered_cols_qlik_hc() function to get the ordered list of columns that correspond to the values in each of these NxCellRows items.

This function will ultimately take a straight hypercube as an argument and do the following:

  • Get the list of dimensions and measures and then combine them into one list.
  • Reorder that list to match the correct column order as defined in the hypercube’s qColumnOrder property.
  • Return that ordered column list.

Then in our get_qlik_obj_data() function, we use our new get_ordered_cols_qlik_hc() function to get our columns. From there, we iterate through each row of each data page in the hypercube, build a dictionary mapping each column name to its cell value, and append each row’s dictionary to a list.

New and updated code is shown below:

from qlik_sdk.apis.Qix import NxPage, HyperCube

    
def get_ordered_cols_qlik_hc(hc: HyperCube) -> list:
    """get ordered columns from Qlik hypercube."""

    # get object columns

    dim_names = [d.qFallbackTitle for d in hc.qDimensionInfo]
    meas_names = [m.qFallbackTitle for m in hc.qMeasureInfo]
    obj_cols = dim_names.copy()
    obj_cols.extend(meas_names)

    # order column array to match hypercube column order

    new_cols = []
    new_col_order = hc.qColumnOrder
    for c in new_col_order:
        new_cols.append(obj_cols[c])
    
    return new_cols



def get_qlik_obj_data(app: NxApp, obj_id: str) -> list:
    """"""

    # opens a websocket connection against the Engine API and gets the app hypercube

    with app.open():
        tbl_obj = app.get_object(obj_id)
        tbl_layout = tbl_obj.get_layout()
        tbl_size = tbl_layout.qHyperCube.qSize
        tbl_hc = tbl_obj.get_hyper_cube_data(
            "/qHyperCubeDef",
            [NxPage(qHeight=tbl_size.qcy, qWidth=tbl_size.qcx, qLeft=0, qTop=0)],
        )


    hc_cols = get_ordered_cols_qlik_hc(tbl_layout.qHyperCube)

    # traverse data pages and store dict for each row

    hc_cols_count = len(hc_cols)
    tbl_data = []

    for data_page in tbl_hc:
        for rows in data_page.qMatrix:
            row = {hc_cols[i]: rows[i].qText for i in range(hc_cols_count)}
            tbl_data.append(row)
    
    return tbl_data


obj_data = get_qlik_obj_data(app=app, obj_id="xxxxxx")

This will get us the desired field: value format that will allow us to better analyze the output, like so:

[
    {'FID': '282', 'Summary Metric': '47', 'Name': 'Sweetwater', ...},
    {'FID': '285', 'Summary Metric': '48', 'Name': 'Sweetwater', ...},
    {'FID': '198', 'Summary Metric': '47', 'Name': 'Vision Drive', ...},
]
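
With rows in this field: value shape, downstream analysis is straightforward. As a quick sketch using the sample rows above (in practice you could also pass obj_data straight to pandas.DataFrame), here’s a standard-library aggregation:

```python
from collections import Counter

# Sample rows in the shape returned by get_qlik_obj_data() above
rows = [
    {"FID": "282", "Summary Metric": "47", "Name": "Sweetwater"},
    {"FID": "285", "Summary Metric": "48", "Name": "Sweetwater"},
    {"FID": "198", "Summary Metric": "47", "Name": "Vision Drive"},
]

# Count rows per Name; note that qText values are strings,
# so cast to int before doing any math on them.
counts = Counter(r["Name"] for r in rows)
avg_metric = sum(int(r["Summary Metric"]) for r in rows) / len(rows)

print(counts)      # Counter({'Sweetwater': 2, 'Vision Drive': 1})
print(avg_metric)  # about 47.33
```

If you need the numeric value rather than the formatted text, the hypercube cells also carry a qNum property alongside qText, which you could store in the row dictionaries instead.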