Transform Patient Care: Powerful Healthcare Analytics for Results


How Data Analytics Improves Patient Outcomes

In today’s healthcare world, data is transforming the way providers deliver care. At Arc Analytics, we see every day how smart use of healthcare analytics leads to better patient outcomes, more efficient operations, and sharper clinical decisions. With the right tools and expertise, healthcare organizations can turn raw data into actionable insights that truly make a difference.

What Is Healthcare Data Analytics?

Healthcare data analytics means using clinical, financial, and operational data to improve care and efficiency. With the rise of electronic health records (EHRs), medical imaging, claims, and patient surveys, healthcare organizations generate more data than ever. When analyzed well, this data reveals patterns that help providers make better decisions and improve patient care. Modern analytics platforms, like Qlik Answers, make it easier for teams to ask complex questions and get clear, actionable answers from their data.

Learn more about our healthcare analytics services.

Types of Analytics in Healthcare

| Type | What It Does | Qlik Solution |
|------|--------------|---------------|
| Descriptive | Looks at past data to understand what happened | Qlik Answers |
| Diagnostic | Explains why certain outcomes occurred | Qlik Answers |
| Predictive | Forecasts future outcomes based on trends | Qlik Predict |
| Prescriptive | Recommends actions to optimize results | Qlik Automate |

Healthcare analytics platforms like Qlik make it possible to move seamlessly from understanding what happened, to why it happened, to what will happen next—and what you should do about it.

How Data Analytics Transforms Patient Care

Early Disease Detection & Risk Prediction

Analytics can spot patients at risk for chronic conditions—like diabetes or heart disease—before symptoms appear. Predictive models, powered by tools such as Qlik Predict, flag high-risk individuals, so providers can act early and prevent complications. This proactive approach helps healthcare teams shift from reactive care to prevention, improving long-term outcomes and reducing costs.

Personalized Treatment Planning

Every patient is unique. By analyzing outcomes from similar cases, providers can tailor treatments to each person’s needs, improving results and reducing side effects. With Qlik Answers, clinicians can quickly compare patient histories and treatment responses, making it easier to design care plans that are truly personalized. This data-driven approach ensures that each patient receives the most effective therapies based on real-world evidence.

Reducing Hospital Readmissions

Unplanned readmissions are costly and often preventable. Analytics helps identify patients likely to return, so care teams can offer extra support, better discharge planning, and follow-up care. Qlik Automate can streamline these processes by triggering automated alerts and workflows for at-risk patients, ensuring that no one falls through the cracks and that interventions happen at the right time.

Case Study: Recovery Center Patient Lifecycle Demo

We recently built a demo app that shows the power of integrated healthcare analytics. This tool connects data from a recovery center—Google Analytics, patient surveys, and clinical records—to map the entire patient journey.

Tracking the Patient Journey

From first contact to discharge, our demo visualizes each step, helping administrators spot bottlenecks and improve care delivery. By integrating multiple data sources, the application provides a 360-degree view of each patient’s experience, making it easier to identify where improvements can be made.

Real-Time Alerts for High-Risk Patients

Using Qlik’s alerting and GeoAnalytics, the app sends real-time notifications when a patient is at risk, even mapping emergency contacts nearby for rapid support. Qlik Automate ensures these alerts are delivered instantly to the right care team members, so action can be taken without delay.

Strengthening Support Networks

By connecting patients with their support networks, facilities can intervene quickly and improve long-term outcomes. The demo leverages Qlik’s advanced mapping and automation features to ensure that support is always within reach, especially during critical moments in a patient’s recovery.

See how our analytics solutions work in action.

Implementing Data Analytics in Healthcare

Building the Right Infrastructure

A strong analytics program needs secure storage, real-time processing, and tools to connect different systems. Qlik’s cloud-based solutions make it easy to scale your analytics infrastructure as your needs grow, while maintaining security and compliance. Explore our data engineering services.

Scalable Governance

We use a scalable governance approach, so your data quality, security, and compliance grow with your analytics capabilities. Qlik’s governance features help ensure that sensitive health data is protected and that analytics remain trustworthy as your organization evolves.

Overcoming Challenges

Data quality, legacy systems, and privacy are real hurdles. Our team helps you plan, integrate, and train for success, leveraging Qlik’s integration and automation tools to simplify even the most complex environments.

The Future: AI & Advanced Analytics

The next wave in healthcare analytics is AI and machine learning—tools that find complex patterns and predict outcomes with new accuracy. With Qlik Predict and open-source platforms, we help organizations prepare for this future, implementing scalable solutions that keep you at the forefront of healthcare innovation.

Why Arc Analytics?

Every healthcare organization is different. We combine technical skill with healthcare know-how to deliver custom analytics that fit your needs and drive real improvements in patient care. Our team works closely with yours to ensure that Qlik’s powerful features—like Qlik Answers, Qlik Predict, and Qlik Automate—are fully leveraged for your unique challenges.

Ready to see what data can do for your patients?


Contact Arc Analytics today to learn how our solutions can help you improve outcomes, boost efficiency, and deliver more personalized care.

Signs Your Organization Needs a Data Consultant Now


When Is It Time to Call a Data Consultant?

Every organization wants to turn data into a competitive edge, but for many in healthcare, education, state government, and small to medium-sized businesses (SMBs), the path isn’t always clear unless you’re partnered with a strong data consultant. If you’re struggling to connect the dots between your data and real-world results, you’re not alone. Recognizing the signs early—and knowing where to get help—can make all the difference.

If you’re in healthcare, you might notice that patient data is scattered across EHRs, billing, and departmental systems, making it tough to see the full picture. In education, student information, learning management, and alumni data often live in silos, blocking a unified view of student progress. State governments face similar hurdles with legacy systems and fragmented agency data. SMBs, meanwhile, often collect sales and marketing data but lack the resources to turn it into actionable insights.

Don’t let data challenges hold you back. Explore our data analytics consulting services or contact us to see how a data consultant can help.

Industry-Specific Data Challenges that a Data Consultant Solves

Healthcare

Healthcare organizations often struggle to translate raw data into better patient care. Disconnected systems, inconsistent records, and a lack of clear data strategy can slow progress. If your IT team is overwhelmed or dashboards aren’t delivering the insights you need, it’s time to consider outside expertise. Learn more about our healthcare data analytics services.

Education

Educational institutions face challenges connecting student data to learning outcomes. Siloed systems, unreliable attendance or grading data, and no clear roadmap for data-driven teaching can hinder student success. If your team can’t keep up or you’re not spotting at-risk students early, a data consultant can help. See our education analytics solutions.

State Government

State agencies often find that citizen data isn’t driving better services due to fragmented legacy systems and inconsistent reporting. Without a unified data strategy, it’s hard to inform policy or improve programs. If your IT team is stretched thin or your reports aren’t actionable, it’s time to act. Discover our government analytics expertise.

Small & Medium Businesses (SMBs)

SMBs may collect plenty of data but struggle to make sense of it. Data scattered across apps, duplicate or missing customer info, and limited IT resources can make it hard to compete. If you can’t see sales trends or predict customer needs, you’re missing out on growth opportunities. Check out our data analytics services for SMBs.

Common Data Challenges: At a Glance

| Challenge | Healthcare | Education | State Gov | SMBs |
|-----------|------------|-----------|-----------|------|
| Data silos & fragmentation | ✓ | ✓ | ✓ | ✓ |
| Poor data quality | ✓ | ✓ | ✓ | ✓ |
| No clear data strategy | ✓ | ✓ | ✓ | ✓ |
| Overwhelmed IT teams | ✓ | ✓ | ✓ | ✓ |
| Weak data visualization/reporting | ✓ | ✓ | ✓ | ✓ |
| Not using advanced analytics | ✓ | ✓ | ✓ | ✓ |

What a Data Consultant Brings

A data consultant offers a fresh perspective, specialized industry knowledge, and the technical skills to solve your toughest data problems. They can help you break down silos, improve data quality, and build a roadmap for success. Plus, they empower your team with best practices and keep you up to date with the latest tools and trends.

Ready to take the next step? Meet our team or explore our full range of services.

Conclusion: Invest in Data-Driven Success

Don’t let data challenges slow you down. Whether you’re in healthcare, education, government, or running a growing business, the right consultant can help you unlock the full potential of your data. Contact us today to start your journey toward smarter decisions and better results.

How Qlik Cloud Improves Public Safety Outcomes


In the complex and critical realm of public safety, timely and insightful data is the bedrock of effective decision-making. From anticipating potential threats to optimizing emergency responses, the ability to rapidly analyze vast amounts of information can quite literally save lives and improve community well-being. This is where Qlik Cloud Analytics steps in, transforming raw data into actionable intelligence that empowers school systems, law enforcement, emergency services, and community leaders.

At Arc Analytics, we’ve seen firsthand how integrating diverse data sets within Qlik Cloud can create a truly powerful picture of public safety dynamics. To illustrate this, we’ve created a unique demonstration that weaves together seemingly disparate data points, providing an unprecedented level of insight into community safety.

Unveiling Insights: A Multi-Layered Look at Community Safety

This demonstration uses Qlik Cloud to visualize complex public safety scenarios, combining publicly available data from across Florida. Almost every layer of data shown is publicly available, yet together these layers provide critical information for understanding safety concerns. These datasets include:

  • Florida School Grading System (1999-2023): One layer presents the entire state of Florida’s public school grading system from 1999 to 2023. You can see the grades and precise locations of schools across the state, allowing you to gauge educational performance visually.
  • Pinellas County Crime Data (Last 10 Years): Superimposed on this, another layer displays Pinellas County crime data for the last decade. This isn’t just dots on a map; it’s a dynamic heat map that visually represents the severity of crime, indicating when and where incidents occurred. This gives a visceral sense of criminal activity patterns.
  • Pinellas County Sex Offender Locations: Perhaps one of the most impactful layers shows the locations of all registered sex offenders in Pinellas County. What makes this particularly compelling is the interactive element: when you hover your cursor over an offender’s location, their mugshot instantly appears.
  • Pinellas County Bus Routes: A crucial layer reveals the bus routes of Pinellas County. By toggling this on, you can see the lines on the map, allowing for a visual correlation between public transportation arteries and areas with higher crime rates. This insight can be vital for understanding movement patterns and potential vulnerabilities.
  • Pinellas County SNAP Locations: An open data source showing each eligible location in Florida that accepts SNAP benefits.

The Critical Role of Monitoring Public School Education in Community Safety

Tracking the educational performance of public schools is vital for understanding community well-being and long-term public safety. By monitoring school grades and trends, stakeholders can identify areas needing support, allocate resources more effectively, and address underlying issues that may impact youth outcomes. Data-driven insights into education not only help improve academic achievement but also contribute to safer, more resilient communities by empowering early intervention and informed decision-making.

Twenty years of trending Florida School Accountability Reports

Optimizing Resource Allocation

Public safety agencies often face the challenge of deploying limited resources to cover vast and dynamic areas. Qlik Cloud provides the analytical horsepower to make these critical decisions with precision:

  • Dynamic Deployment Strategies: By integrating crime data, population density, historical incident reports, and even real-time events, Qlik Cloud allows commanders to visualize hotspots and allocate police patrols, fire services, or emergency medical teams more effectively.
  • Staffing Optimization: Analyzing call volumes, response times, and incident types helps agencies determine optimal staffing levels and shift schedules, ensuring adequate coverage where and when it’s most needed.
  • Infrastructure Planning: Understanding the correlation between infrastructure (like bus routes, as seen in the demo) and incident patterns can inform decisions about where to increase surveillance, improve lighting, or adjust public transport schedules to enhance safety.

Proactive Crime Prevention

Moving beyond reactive responses, Qlik Cloud empowers agencies to adopt proactive crime prevention strategies:

  • Pattern Recognition: The ability to layer data like school locations, crime hotspots, and known offender residences helps identify subtle patterns that might indicate elevated risks in certain areas or times. This allows for targeted community engagement or increased presence.
  • Intervention Program Effectiveness: By tracking the locations and characteristics of crime, agencies can evaluate the effectiveness of community programs, youth outreach, or neighborhood watch initiatives. Qlik can show if interventions in specific areas are truly leading to a reduction in incidents.
  • Risk Area Identification: The demo’s ability to show sex offender locations relative to schools and homes is a prime example of how Qlik can highlight vulnerable areas, allowing for informed alerts to school officials, parents, and community members.

Overview of crime data within Pinellas County

Enhancing Emergency Response and Coordination

In emergencies, every second counts. Qlik Cloud facilitates faster, more informed responses:

  • Real-time Situational Awareness: By integrating live feeds from sensors, traffic cameras, and dispatch systems, Qlik Cloud can provide first responders with real-time dashboards showing the evolving situation, allowing for quicker decisions on routes, hazards, and necessary resources.
  • Incident Command Support: During large-scale emergencies, Qlik applications can consolidate information from multiple agencies (police, fire, medical, public works) into a single, intuitive view, enhancing coordination and resource deployment.
  • Post-Incident Analysis: After an event, Qlik Cloud allows for thorough analysis of response times, resource utilization, and outcomes, providing invaluable lessons for future emergency planning and training.

This powerful mapping tool allows users to toggle each layer on and off, revealing a dynamic narrative of public safety. For instance, a school district official or a concerned parent can easily visualize the proximity of a registered sex offender to their nearest school or even their home location. This immediate, visual insight provides a critical understanding of potential risks that text-based reports simply cannot convey.

The real underlying point of this demo, even though it utilizes publicly sourced data to paint a vivid picture, is this: the more data sources you can bring together, and the deeper the knowledge they provide, the better the decisions you can make, no matter your position, role, or concern for public safety. It’s about empowering everyone with the insights they need.

Geometry lines of 20-minute walking distance to school, sex offenders (with mugshot), and SNAP Benefit in downtown St Petersburg

The Power of Data Collaboration for a Safer Tomorrow

The true strength of Qlik Cloud in public safety lies not just in its individual analytical capabilities but in its ability to foster data collaboration. Public safety is rarely the responsibility of a single entity. It involves complex interactions between law enforcement, fire departments, emergency medical services, local government, schools, and community organizations.

Qlik Cloud provides a unified, secure platform where these diverse stakeholders can share, visualize, and collaborate on critical data. This breaks down traditional information silos, enabling:

  • Cross-Agency Insights: Police departments can share crime trends with school districts, allowing for joint safety initiatives. Emergency services can share incident data with urban planners to identify areas needing better infrastructure.
  • Informed Community Engagement: By making relevant, aggregated data accessible to the public, as demonstrated by the school performance aspects of this demonstration, it fosters greater community trust and encourages informed citizen participation in safety efforts.
  • Proactive Policy Making: Legislators and city planners can use these comprehensive datasets to inform policy decisions, allocate budgets, and design safer communities based on clear evidence rather than assumptions.

Partnering for Public Safety with Arc Analytics

At Arc Analytics, we are dedicated to helping public safety agencies and communities harness the full potential of Qlik Cloud. This example is just one powerful, actionable insight that can be gleaned when data is integrated and visualized effectively. We understand the sensitive nature of public safety data and ensure that our solutions adhere to the highest standards of security and compliance.

We work closely with organizations to:

  • Integrate Disparate Data Sources: Bringing together information from police records, school systems, public transportation, and other critical databases.
  • Develop Custom Analytical Applications: Building tailored Qlik solutions that address your specific public safety challenges and objectives.
  • Provide Expert Implementation and Training: Ensuring your teams are proficient in using Qlik Cloud to drive continuous improvements in public safety.

By transforming raw data into clear, actionable intelligence, Qlik Cloud Analytics, supported by Arc Analytics’ expertise, empowers public safety professionals and concerned citizens alike to make better decisions, protect communities, and build a safer future for everyone.

Ready to see the power of Qlik Cloud Analytics in action? Schedule a full demo with our team today and discover how data-driven insights can transform public safety in your community.

Adding Tabler Icons to Qlik Dashboards


Qlik doesn’t offer much in the way of icons. When creating dashboards in Qlik Cloud, it can be very helpful to add icons to spruce up KPIs, titles, and tables. There are hundreds of use cases for adding some visual flair using icons, but doing so in Qlik can be cumbersome because there are very few built-in icon options.

So, how can we go about adding some icons to our dashboards in an easy and expressive way?

We can use a font! But wait, we’re talking about icons, not text. So how will a font help us?
It turns out that fonts can pack in far more than just standard characters like letters, numbers, and punctuation. One example of a “supercharged” font is Tabler Icons.

Tabler Icons is an open source project that bundles thousands of beautiful icons into multiple formats that you can use freely in your web projects. One such format is a webfont, specifically .ttf, which is a TrueType font file.

Tabler Icons Website

How can we use this font in Qlik?

We’ll add it to a custom Qlik theme and choose icons in our dashboard using variable expansion with a parameter.

Don’t worry if this doesn’t quite make sense yet! Let’s go through each step now.

Steps to set up

Download the Tabler Icons webfont.

  • We can find the tabler-icons.ttf font file in the Tabler Icons Webfont package on the NPM website.


Create a new or open an existing Qlik theme.

  • If you don’t already have a custom theme to add this font to, go ahead and create one based on the instructions laid out on the Qlik Help website. You can also look online for a Qlik theme generator to help get you started.
Add tabler-icons.ttf to the Qlik theme folder.

  • Move the tabler-icons.ttf file to your custom Qlik theme folder. It should look similar to this:
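    (A minimal sketch; the file names here are illustrative, and your theme’s .qext, .json, and .css files will have their own names.)

    my-custom-theme/
    ├── my-custom-theme.qext
    ├── theme.json
    ├── theme.css
    └── tabler-icons.ttf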
Add @font-face to the theme CSS file.

  • Open your theme’s .css file and add this snippet at the top:
    @font-face {
      font-family: "Tabler Icons";
      src: url("tabler-icons.ttf") format("truetype");
      font-weight: normal;
      font-style: normal;
    }
  • Save and close the file.
Add or modify the fontFamily property in the theme JSON file:

  • Open your theme’s .json file and add this snippet near the top:
    "fontFamily": "Open Sans, Tabler Icons, sans-serif"
  • Here’s an example of what it should look like:
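    A minimal sketch (the "_inherit": true property simply inherits everything else from the base Sense theme):

    {
      "_inherit": true,
      "fontFamily": "Open Sans, Tabler Icons, sans-serif"
    }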
ℹ️ Note that in the example above, the snippet includes the Open Sans font, as I want that to be the primary font for normal characters like letters and numbers. You can replace it with any of the default Qlik Cloud font options.

Upload file to Qlik Cloud or Qlik Sense Client-Managed.

  • To add your custom theme to Qlik, you must first save the theme folder as a ZIP file.
  • How to upload your theme to Qlik Cloud
    • You can follow this Qlik Help page guide on how to upload your theme ZIP file to Qlik Cloud.
    • After uploading the theme ZIP file, you should see the theme in the Admin Console.
  • How to upload your theme to Client-Managed
    • On Qlik Sense Client-Managed, you import the theme ZIP file through the QMC, under the Extensions section.
How to use icons in a dashboard

  • Open a Qlik app and add this Qlik script to the Data Load Editor in the Main section:
    Set GetTablerIcon = Chr(Num#('$1', '(hex)'));
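  • Your Main script should then look something like this (a sketch; the auto-generated SET statements vary by app and region):

    SET ThousandSep=',';
    SET DecimalSep='.';
    SET DateFormat='M/D/YYYY';
    SET TimestampFormat='M/D/YYYY h:mm:ss[.fff] TT';

    // converts a Tabler Icons hex code (e.g. ea59) into its font character
    Set GetTablerIcon = Chr(Num#('$1', '(hex)'));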
  • Find the icon you want on the Tabler Icons website and select it. In the box that appears, click on the hex value to copy it:
  • Go to the app Sheet view and switch the app theme to use our uploaded theme.
In your Qlik app, select any Qlik object, and then choose an expression property.

  • For example, you can create or select a bar chart object and then open the expression editor for the Title expression.
In the property’s expression editor, we’ll use dollar-sign expansion with our GetTablerIcon variable, passing our copied Tabler Icons hex code as the parameter.

  • Make this the expression:
    =$(GetTablerIcon(ea59)) & ' Sales by County'
  • Then select the Apply button to save it.

You should now see your icon in the chart title!

If your icon doesn’t appear or you see a placeholder character in the chart title where our icon should be, you probably just need to update the font property.

  • To do this, go to the chart styling tab.
  • Find the font property we want to change (Title in this example) and then choose the option that includes Tabler Icons.
ℹ️ Note that if you want to “pair” the Tabler Icons font with a primary font that regular characters will use, refer back to the fontFamily step above.

Summary

You should now be able to use Tabler Icons anywhere in a Qlik dashboard that supports text expressions and changing the font!

That should get you very far. Try changing the font color and size to see how the icons scale and can be recolored just like text.

Building A Robust Data-Driven Culture


In today’s fiercely competitive business landscape, data has moved beyond the realm of simple record-keeping to become the very engine of strategic advantage. Organizations that can effectively harness the insights hidden within their data streams are demonstrably more agile, innovative, and ultimately, more successful. However, the journey towards becoming a truly data-driven organization is not merely about deploying sophisticated analytics platforms. It requires a fundamental shift in culture, a deep-seated commitment that permeates every level of the organization, from the executive suite to individual contributors. This comprehensive guide will navigate the essential steps involved in cultivating a robust data-driven culture, underscoring its profound benefits and illuminating the critical role of people, processes, and technology in this transformative endeavor.

Laying the Foundation: Identifying Key Pain Points and Opportunities

The initial and foundational stage in building a data-driven culture involves a collaborative and thorough effort to pinpoint the specific areas within the organization where data can exert the most significant positive influence. This process extends beyond simply identifying obvious operational bottlenecks or areas of inefficiency. It necessitates engaging stakeholders from across all departments – sales, marketing, operations, finance, customer service, and beyond – to understand their unique challenges and the questions they struggle to answer with existing information. For instance, the marketing team might grapple with understanding which campaigns yield the highest return on investment, while the sales team might lack clarity on the characteristics of their most successful leads. Operations could be struggling with unpredictable supply chain disruptions, and customer service might be reactive rather than proactively addressing potential issues.

Furthermore, the focus should not solely be on rectifying problems. A truly data-driven mindset actively seeks opportunities where data can fuel innovation, enhance the customer experience in meaningful ways through personalization, optimize the allocation of resources across various initiatives, and even identify entirely new business models. By involving a diverse range of perspectives, organizations can uncover a broader spectrum of both pain points ripe for data-driven solutions and untapped opportunities waiting to be unlocked. Prioritizing these identified areas based on their potential impact on key business objectives and the practical feasibility of implementing data-driven solutions will ensure that initial efforts are strategically aligned and deliver tangible value, fostering early buy-in and demonstrating the power of a data-centric approach.

Empowering Solutions: Leveraging Data to Solve Problems and Drive Innovation

Once the key pain points and promising opportunities have been identified, the next crucial step involves strategically applying various methodologies of data analysis to extract meaningful insights and drive tangible improvements. This encompasses a spectrum of analytical techniques, each suited to answering different types of questions. Descriptive analysis provides a historical overview of what has occurred, offering valuable context. Diagnostic analysis delves deeper, seeking to understand the underlying reasons and correlations behind observed trends. Predictive analysis leverages historical data and statistical modeling to forecast future outcomes and anticipate potential challenges or opportunities. Finally, prescriptive analysis goes beyond prediction by recommending specific actions and interventions to achieve desired results.

For example, if a sales team is struggling with high customer churn, diagnostic analysis might reveal specific customer segments or interaction patterns that are strong indicators of attrition. Predictive modeling could then forecast which current customers are most likely to churn, allowing for proactive intervention. Prescriptive analytics could even recommend targeted strategies, such as personalized offers or enhanced support, to mitigate this risk. Similarly, in product development, analyzing customer feedback data (both structured and unstructured) can provide invaluable insights into unmet needs, guiding the creation of innovative new features or products. The process of leveraging data for problem-solving and innovation is iterative, requiring a willingness to formulate hypotheses, rigorously test them against available data, and refine analytical approaches based on the evidence uncovered. Embracing a culture of experimentation, including A/B testing different data-driven strategies, is essential for validating their effectiveness and fostering a continuous cycle of improvement and learning.

Cultivating Data Fluency: The Cornerstone of a Data-Driven Culture

The successful and sustainable embedding of a data-driven culture within an organization fundamentally relies on cultivating a high degree of data fluency across all levels of its workforce. This does not imply that every employee needs to become a data scientist or possess advanced statistical expertise. Instead, it signifies fostering a widespread comfort level in working with data, enabling individuals to understand basic data concepts, interpret visualizations, formulate relevant questions based on data, and confidently utilize data-backed insights in their daily decision-making processes. The specific levels of data literacy required will naturally vary depending on individual roles and responsibilities. However, a foundational understanding of data privacy, ethical data usage, and the ability to critically evaluate data sources are essential for everyone.

Organizations can adopt a multi-pronged approach to elevate data literacy. This includes implementing comprehensive training programs tailored to different skill levels and roles, creating easily accessible internal resources such as data glossaries, style guides for data interpretation, and case studies showcasing successful data application. Mentorship programs that pair data experts with colleagues seeking to enhance their skills can also be highly effective. A critical element is ensuring that data is presented in an accessible and understandable manner for non-technical users, often through user-friendly dashboards and intuitive data visualization tools that abstract away unnecessary complexity. Leadership plays a pivotal role in championing data literacy initiatives by actively demonstrating the value of data in their own decision-making processes, visibly supporting training efforts, and fostering an environment where asking data-related questions is not only encouraged but expected. Ultimately, nurturing a culture of intellectual curiosity, where employees are empowered to explore data and seek evidence-based answers, will solidify data fluency as a core organizational competency and drive widespread adoption of data-driven practices.

Equipping Your Team: Choosing and Implementing the Right Data Tools

The strategic selection and effective implementation of appropriate data tools are critical enablers of a data-driven culture. The right tools can democratize access to data, empower users to perform their own analyses, and streamline the process of generating insights. When evaluating potential data tools and platforms, organizations should consider several key criteria. Usability for a diverse range of users, regardless of their technical proficiency, is paramount. Seamless integration capabilities with existing systems and data sources are essential to break down silos and ensure data accessibility. Scalability to handle growing data volumes and evolving analytical needs is crucial for long-term viability. Robust security features are non-negotiable to protect sensitive data and ensure compliance with relevant regulations. Finally, the overall cost-effectiveness of the tools, considering both initial investment and ongoing maintenance, must be carefully evaluated.

Platforms like Qlik Cloud offer a powerful and versatile suite of capabilities designed to foster a data-driven environment. Their intuitive and interactive data visualization tools empower users to create insightful dashboards and reports with minimal technical expertise, while their robust data integration features facilitate the connection and harmonization of data from disparate sources. Features such as collaborative analytics enable teams to work together on data exploration and insight generation, and embedded analytics capabilities allow for the seamless integration of data insights into existing applications and workflows. However, simply selecting the right tools is only part of the equation. Successful adoption necessitates a well-planned implementation strategy, comprehensive training programs to ensure users can effectively leverage the tools’ features, and ongoing support to address any technical challenges or user questions. Furthermore, establishing clear data governance policies and procedures is essential to ensure the quality, accuracy, and trustworthiness of the data being utilized within these tools, fostering confidence and driving adoption.

Conclusion: Embracing Data as the Engine of Success

In conclusion, the journey towards building a truly robust and impactful data-driven culture requires a holistic and sustained effort that encompasses people, processes, and technology. By systematically identifying key pain points and opportunities, empowering data-driven solutions, cultivating widespread data fluency across the organization, strategically selecting and implementing the right data tools, and diligently sustaining the momentum through continuous learning and leadership commitment, organizations can transform data from a latent asset into the very engine of their success, driving innovation, enhancing efficiency, fostering deeper customer understanding, and ultimately achieving a significant and sustainable competitive advantage in today’s data-rich world.

What to Expect From a Data Analytics Consulting Partner


Navigating the world of data analytics can feel like trying to decipher an ancient language. You know the potential is there – those hidden insights that can propel your business forward – but unlocking them often requires a skilled guide. That’s where a data analytics consulting partner comes in. But just like choosing the right travel companion, finding the right partner can make all the difference between a smooth journey and a frustrating detour. This isn’t just about someone setting up a dashboard and calling it a day. A true partner becomes an extension of your team, deeply understanding your unique challenges and working collaboratively to achieve your specific goals. So, what should you really expect from this kind of relationship?

What Should You Be Receiving? It’s More Than Just Deliverables.

When you engage a data analytics consulting partner, you’re not just buying a service; you’re investing in expertise and a collaborative relationship. Here are some key things you should expect to receive:
  • Flexibility That Fits Your Needs: Forget rigid contracts and pre-packaged solutions. A good partner understands that your needs can evolve. Expect pre-authorized hours that provide budget control while allowing for necessary work to be completed. Think of it as setting a clear boundary, like saying, “Let’s scope this project within 40 hours, and if we need more, we’ll talk.” This demonstrates respect for your budget and ensures transparency. Furthermore, look for flexible service level agreements (SLAs). These shouldn’t be one-size-fits-all. A partner should be willing to tailor SLAs – perhaps a standard four-hour response time for typical requests, which might even adjust to a quicker 30-minute response during critical periods – all tied to clearly defined scopes of work and agreed-upon hourly rates. This adaptability shows they’re truly invested in supporting your business rhythm.
  • A Consistent and Dedicated Point of Contact: Imagine having to explain your project to a new person every time you reach out. Frustrating, right? Expect a dedicated, 1:1 relationship where you work with a consistent team that builds a deep understanding of your business, your data, and your Qlik Cloud environment. This eliminates the inefficiencies of multiple touchpoints and the impersonal feel of large, impersonal firms relying on offshore subcontractors who may not have the same level of direct investment in your success. You deserve a team that’s in the trenches with you, not just filling out timesheets from afar.
  • Proactive Partnership, Not Just Order-Taking: A great consulting partner doesn’t just wait for you to tell them what to do. They should be proactive, bringing insights and suggestions to the table based on their understanding of your business and the capabilities of Qlik Cloud. Expect regular check-ins – not just status updates, but strategic conversations about progress, potential roadblocks, and future opportunities. They should be genuinely invested in understanding your specific business goals and tailoring their approach to help you achieve them.

Signs You Might Need a New Qlik Consulting Partner: Don’t Settle for Less.

Are you getting everything you should be from your current data analytics partner? Here are some red flags that might indicate it’s time for a change:
  • Silence is Not Golden: A lack of proactive communication or consistently missed deadlines are clear indicators that your partner isn’t prioritizing your needs. You shouldn’t have to constantly chase them for updates or feel like your project is on the back burner.
  • Quote Chaos: Receiving inaccurate quotes or having to constantly request updated pricing signals a lack of attention to detail and can lead to budget surprises. Transparency in pricing is crucial for building trust.
  • The Price Doesn’t Feel Right: Be wary of price gouging, an unfortunately common practice in technology sales, especially within sectors like state and local government. A trustworthy partner will be upfront and transparent about their pricing and licensing models, ensuring you’re paying a fair market value for the Qlik Cloud products and services. They should be working to get you the best value, not just maximizing their profit at your expense.
  • Where’s the Innovation?: If your partner isn’t bringing innovative solutions or demonstrating a deep understanding of the latest Qlik Cloud features and how they can benefit your specific industry, you might be missing out. A good partner stays ahead of the curve and helps you do the same.
  • They Don’t “Get” Your Business: A partner who doesn’t take the time to understand the unique nuances and challenges of your specific industry is less likely to deliver truly impactful solutions. Generic advice won’t cut it.
  • The Feeling’s Not Mutual: Ultimately, if you feel like your current partner isn’t truly invested in your success, isn’t communicative, or isn’t providing the level of service you expect, it’s a strong sign that it might be time to explore other options.

The Value of Industry Expertise: Why It Matters.

While a broad understanding of data analytics is essential, a partner with proven experience in your specific industry can bring invaluable insights. For example, Arc Analytics specializes in the education, healthcare, and government sectors, where we’ve developed a deep understanding of the unique data challenges and regulatory landscapes. However, our experience isn’t limited to these verticals. Our history of completing hundreds of projects across several sectors demonstrates our adaptability and ability to apply our Qlik Cloud expertise to diverse business needs. This cross-industry experience allows us to bring best practices and innovative solutions from different fields to your specific challenges.

Deliverables: Tailored Solutions Designed for You.

Forget the idea of a one-size-fits-all solution. A quality data analytics consulting partner understands that your needs are unique. Expect solutions that are configured and customized to the specific scope of work you define. This means the dashboards, reports, and integrations you receive are designed to answer your specific business questions and track your key performance indicators. While the underlying technology might be consistent, the final deliverables should feel like they were built for you, not just for anyone. We also believe in transparent reporting when it comes to accounting, so you have a clear understanding of the investment you’re making.

Finding the Right Fit: It’s About More Than Just Technology.

Choosing a data analytics consulting partner is a significant decision. It’s about finding a team that not only possesses the technical expertise with Qlik Cloud but also prioritizes clear communication, genuine collaboration, and a deep understanding of your business. You deserve a partner who feels like a natural extension of your own team, dedicated to helping you unlock the full potential of your data. Ready to explore how a dedicated and experienced Qlik Cloud partner can help you achieve your data analytics goals? We invite you to reach out and discover the difference a true partnership can make. Contact our team.

Connecting to Qlik Sense with Python


While the Qlik platform has maintained and supported developer libraries in JavaScript and .NET/C# for several years, Qlik has more recently released a library for interacting with Qlik in Python. They call it the Platform SDK, which is also available as a TypeScript library.

The Python library is essentially a set of Python classes and methods that mirror the structures and functions of Qlik’s REST and Engine (QIX) APIs, while also providing some conveniences around authentication and WebSocket connections. The library is open for anyone to download and use thanks to its permissive MIT license.

Use cases for the Qlik Python SDK include writing automation scripts for repetitive admin tasks, loading app and object data into a Pandas dataframe, and even creating reports built off of app or log data.

Installing the library is very simple — just make sure you are using at least Python 3.8:

python3 -m pip install --upgrade qlik-sdk

Let’s look at some examples of how we can use the library. Below, we import a few classes from the qlik_sdk library and then create some variables to hold our Qlik Cloud tenant URL and API key. We’ll use the API key to authenticate with a bearer token, but an OAuth 2.0 implementation is also available. Learn how to generate an API key here. The tenant URL and API key are then used to create an Apps object, which provides some high-level methods for interacting with app documents in Qlik Cloud.

from qlik_sdk import Apps, AuthType, Config

# configure the connection to our Qlik Cloud tenant

base_url = "https://your-tenant.us.qlikcloud.com/"
api_key = "xxxxxx"
apps = Apps(Config(host=base_url, auth_type=AuthType.APIKey, api_key=api_key))

Now that we’ve got our authentication situated, let’s add some code to interact with a Qlik app and its contents. First, let’s import a new class, NxPage, which describes a hypercube page (more about Qlik hypercubes here). Then let’s create a new function, get_qlik_obj_data(), to define the steps for getting data from a Qlik object, like a table or bar chart. In this function, we can take an app parameter and an obj_id parameter to open a WebSocket connection to the specified app, get the app layout, get the size of the object’s hypercube, and then fetch the data for that hypercube:

from qlik_sdk.apis.Apps import NxApp  # used in the type hints below; adjust the import if your SDK version locates it elsewhere
from qlik_sdk.apis.Qix import NxPage

app = apps.get("xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx")

def get_qlik_obj_data(app: NxApp, obj_id: str) -> list:
    """Get data from an object in a Qlik app."""

    # opens a websocket connection against the Engine API and gets the app hypercube

    with app.open():
        tbl_obj = app.get_object(obj_id)
        tbl_layout = tbl_obj.get_layout()
        tbl_size = tbl_layout.qHyperCube.qSize
        tbl_hc = tbl_obj.get_hyper_cube_data(
            "/qHyperCubeDef",
            [NxPage(qHeight=tbl_size.qcy, qWidth=tbl_size.qcx, qLeft=0, qTop=0)],
        )
    
    return tbl_hc


obj_data = get_qlik_obj_data(app=app, obj_id="xxxxxx")

This code would end up returning a list of data pages, something like this:

[NxDataPage(qArea=Rect(qHeight=50, qLeft=0, qTop=0, qWidth=10), qIsReduced=None, qMatrix=[NxCellRows(), NxCellRows(), NxCellRows(), ...]

And if we then peek into one of the NxCellRows contained in the qMatrix property, we’d see an object like this:

NxCell(qAttrDims=None, qAttrExps=None, qElemNumber=29, qFrequency=None, qHighlightRanges=None, qInExtRow=None, qIsEmpty=None, qIsNull=None, qIsOtherCell=None, qIsTotalCell=None, qMiniChart=None, qNum=282, qState='O', qText='282')

The cell value is shown as 282 in the qText property. We may note, though, that we can’t readily identify the field that this value represents.

Let’s add some code to make the resulting dataset include the fields for each cell value. We can do that by adding a get_ordered_cols_qlik_hc() function to get the ordered list of columns in each of these NxCellRows items.

This function will ultimately take a straight hypercube as an argument and do the following:

  • Get the list of dimensions and measures and then combine them into one list.
  • Reorder that list to match the correct column order as defined in the hypercube’s qColumnOrder property.
  • Return that ordered column list.

Then in our get_qlik_obj_data() function, we use our new get_ordered_cols_qlik_hc() function to get our columns. From there we iterate through each row of each data page in the hypercube, create a dictionary that maps each column name to its cell value, and append those dictionaries to a list, one per row.

Here is the new and updated code:

from qlik_sdk.apis.Qix import NxPage, HyperCube

    
def get_ordered_cols_qlik_hc(hc: HyperCube) -> list:
    """get ordered columns from Qlik hypercube."""

    # get object columns

    dim_names = [d.qFallbackTitle for d in hc.qDimensionInfo]
    meas_names = [m.qFallbackTitle for m in hc.qMeasureInfo]
    obj_cols = dim_names.copy()
    obj_cols.extend(meas_names)

    # order column array to match hypercube column order

    new_cols = []
    new_col_order = hc.qColumnOrder
    for c in new_col_order:
        new_cols.append(obj_cols[c])
    
    return new_cols



def get_qlik_obj_data(app: NxApp, obj_id: str) -> list:
    """Get data from an object in a Qlik app."""

    # opens a websocket connection against the Engine API and gets the app hypercube

    with app.open():
        tbl_obj = app.get_object(obj_id)
        tbl_layout = tbl_obj.get_layout()
        tbl_size = tbl_layout.qHyperCube.qSize
        tbl_hc = tbl_obj.get_hyper_cube_data(
            "/qHyperCubeDef",
            [NxPage(qHeight=tbl_size.qcy, qWidth=tbl_size.qcx, qLeft=0, qTop=0)],
        )


    hc_cols = get_ordered_cols_qlik_hc(tbl_layout.qHyperCube)

    # traverse data pages and store dict for each row

    hc_cols_count = len(hc_cols)
    tbl_data = []

    for data_page in tbl_hc:
        for rows in data_page.qMatrix:
            row = {hc_cols[i]: rows[i].qText for i in range(hc_cols_count)}
            tbl_data.append(row)
    
    return tbl_data


obj_data = get_qlik_obj_data(app=app, obj_id="xxxxxx")

This will get us the desired field: value format that will allow us to better analyze the output, like so:

[
    {'FID': '282', 'Summary Metric': '47', 'Name': 'Sweetwater', ...},
    {'FID': '285', 'Summary Metric': '48', 'Name': 'Sweetwater', ...},
    {'FID': '198', 'Summary Metric': '47', 'Name': 'Vision Drive', ...},
]
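Since each row is now a plain dictionary keyed by field name, handing the result to Pandas takes one line. Here’s a minimal sketch (the 'Summary Metric' cast is just an example; qText values are always strings, so convert numeric columns as needed):

import pandas as pd

# each dict in obj_data becomes a row; dict keys become column names
df = pd.DataFrame(obj_data)

# qText values are strings, so cast numeric columns before aggregating
df['Summary Metric'] = pd.to_numeric(df['Summary Metric'], errors='coerce')

print(df.head())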

Interpreting Formats in a Field When Several Are Possible

One of the toughest aspects of dealing with freeform data is that the input layer may not have proper data validation processes to ensure data cleanliness. This can result in very ugly records, including non-text fields that are riddled with incorrectly formatted values.

Take this example dataset:

[Test Data] table:

| RecordID | DurationField |
|----------|---------------|
| 1        | 00:24:00      |
| 2        | 00:22:56      |
| 3        | 00:54         |
| 4        | 0:30          |
| 5        | 01            |
| 6        | 4             |
| 7        | 2:44          |
| 8        | 5 MINUTES     |
| 9        | 6/19          |

Those values in the [DurationField] column are all formatted differently! How would we be able to consistently interpret this field as having an Interval data type?

One of the ways you might be inclined to handle something like this is to use If() statements. Let’s see an example of that now.

[New Data]:
Load
    [DurationField]
  , Interval( If(IsNum( Interval#([DurationField], 'hh:mm:ss') )
      , Interval#([DurationField], 'hh:mm:ss')
      , If(IsNum( Interval#([DurationField], 'mm:ss') )
          , Interval#([DurationField], 'mm:ss')
          , If(IsNum( Interval#([DurationField], 'mm') )
              , Interval#([DurationField], 'mm')
              , If(IsNum( Interval#(Left([DurationField], Index([DurationField], 'MIN')-2), 'm') )
                  , Interval#(Left([DurationField], Index([DurationField], 'MIN')-2), 'm')
                  , If(IsNum( Interval#(Replace([DurationField], '/', ':'), 'm:s') )
                      , Interval#(Replace([DurationField], '/', ':'), 'm:s')
                      , Null()
      ))))), 'hh:mm:ss')  as [New DurationField]
;
Load
    Upper(Trim([DurationField])) as [DurationField]
Resident [Test Data];

It’s a mess! Qlik has to evaluate each Interval#() function twice in order to, first, check to see if the value was properly interpreted as a duration (“interval”) value, and then, second, to actually return the interpreted duration value itself.

One of the nice alternative ways of handling this is to use a different conditional function, like Alt(). This function achieves the same thing as using the If() and IsNum() functions in conjunction. You can use:

Alt(arg1, arg2, arg3)

…Instead of:

If(IsNum(arg1), arg1, If(IsNum(arg2), arg2, If(IsNum(arg3), arg3)))
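As a quick illustration of that behavior, here’s a tiny, hypothetical load you can paste into the Data Load Editor. Num#('abc') fails to interpret its input as a number, so Alt() falls through to the first argument that does have a numeric representation:

// Num#('abc') returns a text value, so Alt() falls through to Num#('42')
[Alt Demo]:
Load
    Alt(Num#('abc'), Num#('42'), 0) as FirstNumeric    // returns 42
AutoGenerate 1;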

Let’s see how that may look using our previous example data:

[New Data]:
Load
    [DurationField]
  , Interval(Alt(
        Interval#([DurationField], 'hh:mm:ss')
      , Interval#([DurationField], 'mm:ss')
      , Interval#([DurationField], 'mm')
      , Interval#(Left([DurationField], Index([DurationField], 'MIN')-2), 'm')
      , Interval#(Replace([DurationField], '/', ':'), 'm:s')
      , Null()
    ), 'hh:mm:ss') as [New DurationField]
;
Load
    Upper(Trim([DurationField])) as [DurationField]
Resident [Test Data];

Basically what’s happening there is:

  • The preceding load happening at the bottom of that script is there to do some basic standardization of the [DurationField] field so that it’s easier to pattern-match.
  • In the rest of the script, we’re using the Alt() function (Qlik Help page) to check whether its arguments are numeric or not. Each of its arguments is an Interval#() function, which tries to interpret the values of the [DurationField] field using the provided format, like 'hh:mm:ss' or 'm:s'.
  • So it’s basically saying:

If Interval#([DurationField], 'hh:mm:ss') returns a value interpreted as an Interval, then return that value (for example, 00:24:00). But if a value couldn’t be interpreted as an Interval (like 5 mins for example, where the Interval#() function would return a text type), we go to the next Interval#() function. If Interval#([DurationField], 'mm:ss') returns a value…

This should all result in a table pairing each original value with its interpreted interval value.

Success! 🎉

Using Geo Functions to Aggregate Coordinates


In this post, I want to look at how to use a few of the built-in Qlik GeoAnalytics functions that will allow us to manipulate and aggregate geographic data.

Specifically, we are going to look at how to calculate a bounding box for several grouped geographic points, reformat the result, and then calculate the centroid of those bounding boxes. This can be a useful transformation step when your data has geographic coordinates that need to be aggregated into a single, centered point for a particular grouping.

In our example, we have a small dataset with a few records pertaining to Florida locations, including coordinates for each Zip Code within the city of Brandon. Our goal is to take those four Brandon coordinates, aggregate them into a single, centered point, and then return that point in the correct format for displaying it in a Qlik map object.

Here’s our data, loaded from an Inline table:

[Data]:
Load * Inline [
State   ,   County          ,   City            ,   Zip     ,   Lat         ,   Long
FL      ,   Hillsborough    ,   Apollo Beach    ,   33572   ,   27.770687   ,   -82.399753
FL      ,   Hillsborough    ,   Brandon         ,   33508   ,   27.893594   ,   -82.273524
FL      ,   Hillsborough    ,   Brandon         ,   33509   ,   27.934039   ,   -82.302518
FL      ,   Hillsborough    ,   Brandon         ,   33510   ,   27.955670   ,   -82.300662
FL      ,   Hillsborough    ,   Brandon         ,   33511   ,   27.909390   ,   -82.292292
FL      ,   Hillsborough    ,   Sun City        ,   33586   ,   27.674490   ,   -82.480954
];

Let’s see what happens when we load this data and create a new map that has a point layer, using City as the dimension and the Lat/Long fields as the location fields:

What we may notice here is that the city of Brandon does not show up on the map — this is because the dimensional values for the point layer need to have only one possible location (in this case, one lat/long pair). Since Brandon has multiple Lat/Long pairs (one for each Zip Code), the map can’t display a single point for Brandon.

Okay, so let’s get the bounding box so that we can use it to get the center-most point. This is ultimately what we want our bounding box to be:

To do this in Qlik we’ll use the GeoBoundingBox() function, which calculates the smallest possible box that contains all given points, as shown in the example image above.

Here’s the script we can use in the Data Load Editor:

[Bounding Boxes]:
Load
    [City]
  , GeoBoundingBox('[' & Lat & ',' & Long & ']') as Box
Resident [Data]
  Group By [City]
;

That results in this:

| City         | Box                                                                            |
|--------------|--------------------------------------------------------------------------------|
| Apollo Beach | {"qTop":-82.399753,"qLeft":27.770687,"qBottom":-82.399753,"qRight":27.770687}   |
| Brandon      | {"qTop":-82.273524,"qLeft":27.893594,"qBottom":-82.302518,"qRight":27.95567}    |
| Sun City     | {"qTop":-82.480954,"qLeft":27.67449,"qBottom":-82.480954,"qRight":27.67449}     |

Alright so we now have our bounding boxes for our cities, but we can’t use those points quite yet — right now we just have the top, left, right, and bottom points separately:

What we need to do is reformat those points into actual coordinates for the bounding box, like so:

We can achieve this by using the JsonGet() function, which can return values for specific properties of a valid JSON string. This is useful to us because the GeoBoundingBox() function we used before returns the top, left, right, and bottom points in a JSON-like string that we can easily parse for this step.

Here’s the Qlik script we can use to parse those points into actual coordinates:

[Formatted Box]:
Load
    [City]
  , [Box]
  , '[['
    & JsonGet([Box], '/qTop') & ',' & JsonGet([Box], '/qLeft')
    & '],[' & JsonGet([Box], '/qBottom') & ',' & JsonGet([Box], '/qLeft') 
    & '],[' & JsonGet([Box], '/qBottom') & ',' & JsonGet([Box], '/qRight')
    & '],[' & JsonGet([Box], '/qTop') & ',' & JsonGet([Box], '/qRight')
    & '],[' & JsonGet([Box], '/qTop') & ',' & JsonGet([Box], '/qLeft')
    & ']]' as [Box Formatted]
Resident [Bounding Boxes];

Drop Table [Bounding Boxes];

This results in correctly formatted bounding box coordinates:

| City         | Box Formatted                                                                                                      |
|--------------|--------------------------------------------------------------------------------------------------------------------|
| Apollo Beach | [[-82.399753,27.770687],[-82.399753,27.770687],[-82.399753,27.770687],[-82.399753,27.770687],[-82.399753,27.770687]] |
| Brandon      | [[-82.273524,27.893594],[-82.302518,27.893594],[-82.302518,27.95567],[-82.273524,27.95567],[-82.273524,27.893594]]   |
| Sun City     | [[-82.480954,27.67449],[-82.480954,27.67449],[-82.480954,27.67449],[-82.480954,27.67449],[-82.480954,27.67449]]      |

So now that we have these coordinates, we can aggregate the box coordinates into a center point using the GeoGetPolygonCenter() function, which will take the given area and output a centered point coordinate.

Here’s the script we can use for this:

[Centered Placenames]:
Load *
  , KeepChar(SubField([City Centroid], ',', 1), '0123456789.-') as [City Centroid Long]
  , KeepChar(SubField([City Centroid], ',', 2), '0123456789.-') as [City Centroid Lat]
;
Load
    [City]
  , GeoGetPolygonCenter([Box Formatted]) as [City Centroid]
Resident [Formatted Box];

Drop Table [Formatted Box];

This will result in the center points for each city. We also split out the Lat/Long fields into separate fields for easier use in the map:

| City         | City Centroid                 | City Centroid Lat | City Centroid Long |
|--------------|-------------------------------|-------------------|--------------------|
| Apollo Beach | [-82.399753,27.770687]        | 27.770687         | -82.399753         |
| Brandon      | [-82.288021,27.9094739069767] | 27.9094739069767  | -82.288021         |
| Sun City     | [-82.480954,27.67449]         | 27.67449          | -82.480954         |

And now we can view our city-centered points on a map:

And there we have it! It’s not the perfect centering we may have expected, but that could be due to the map projection we’re using or the specificity of the coordinates we chose. Either way, this is a great way to aggregate several coordinates down to their center point.

JavaScript Bookmarklets that supercharge Qlik Sense


One of the most-utilized features of web browsers is the bookmark; everyone has their favorite sites saved for later, but that’s really their only function – to navigate you. What if you wanted to have a bookmark that, instead of simply taking you to another webpage, could dynamically make changes to the page you’re currently on? You may be thinking, “oh, you mean browser extensions?” Yes, those absolutely fall into that category of functionality and purpose, but here’s a special type of bookmark you may not have known about: the JavaScript bookmarklet.

A JavaScript bookmarklet is a browser bookmark that runs JavaScript code instead of just navigating you to a webpage. They start with the javascript: quasi-protocol rather than the usual http:// or https:// protocol that we’re used to seeing. These bookmarklets are created the same way you create regular bookmarks and also live in your bookmark bar or folders.
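For example, this one-liner saved as a bookmark’s URL is already a complete, working bookmarklet:

javascript:(function(){alert('Hello from a bookmarklet!');})();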

I’ve written a few bookmarklets to make a couple of repetitive or annoying Qlik tasks easier. Let’s look at one of my favorites.

Opening an app from the QMC

If you’re a Qlik Sense on Windows power user like me, then you live in both the QMC and the Hub. For tasks like duplicating other developers’ apps, finding and opening generically named apps, or opening apps where I have access to the app but not the stream, my usual workaround has been to copy an already-opened app’s URL and paste in the new app’s AppID.

Enter this handy little bookmarklet: just select a table row in the Apps section of the QMC, click the bookmark, and the selected app automatically opens in a new tab!

Let’s walk through the code for this bookmarklet.
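Here’s an annotated sketch of the approach (the row selector and data attribute below are assumptions, so inspect your QMC version’s DOM and adjust them; the /sense/app/<AppID> Hub URL format is standard for Qlik Sense on Windows):

javascript:(function () {
  // find the currently selected row in the QMC Apps table
  // NOTE: 'tr.selected' and 'data-id' are assumptions -- inspect your
  // QMC version's markup and adjust the selector/attribute to match
  var row = document.querySelector('tr.selected');
  if (!row) { alert('Select an app in the QMC Apps table first.'); return; }
  var appId = row.getAttribute('data-id');
  if (!appId) { alert('Could not read an app ID from the selected row.'); return; }
  // /sense/app/<AppID> is the standard Hub URL for opening an app
  window.open(window.location.origin + '/sense/app/' + encodeURIComponent(appId), '_blank');
})();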

How do I make this a bookmark in my browser?

Below are the steps for adding this as a bookmark in your browser — note that I am using the Brave browser, which has similar settings to Google Chrome and Microsoft Edge. If the instructions don’t match the browser that you’re using, do a quick web search on how to add a bookmark to your browser. You should be able to pick it up at step 3 below.

  1. Select your browser’s main menu, find the Bookmarks option, and then select the Bookmark manager option.
  2. Find the menu button and select the Add new bookmark option.
  3. Name the bookmark Open app from QMC.
  4. In the URL field, type in javascript:.
  5. Select all of the bookmarklet code from above, copy it, and paste it into the URL field right after the javascript: you typed.

After you hit the Save button, your new bookmarklet should be ready to use!