50 Things I Believe About Analytics
This post was originally posted on LinkedIn.
In no particular order:
- Spreadsheets should never be bottom-aligned.
- Weekly reports should have a maximum latency of one day.
- Monthly reports should have a maximum latency of one week.
- Dashboards should be one page/screen.
- The data-pixel ratio should be ruthlessly maximized.
- “Key business questions” are insidious.
- Every analysis should start with a clearly articulated hypothesis.
- KPIs are just metrics if they do not have a target set.
- Benchmarks are not targets.
- KPI targets can be developed in the absence of historical benchmarks.
- All data is incomplete.
- All data is inaccurate.
- The length of the question has no relationship to the analytics effort required to answer it.
- Dashboards should not include “insights” or “recommendations.”
- “Insights” is a dirty word.
- Use of the phrase “actionable insights” should be a finable offense.
- It’s okay to say, “I don’t know (yet)” to a stakeholder.
- Dates should be in rows in the data, not across columns.
- Pie charts almost always add unnecessary cognitive load. Doughnut charts are just as bad as pie charts.
- Waterfall charts are amazing…in situations where they are appropriate.
- In-cell bar charts are amazing…unless terribly implemented by a digital analytics platform.
- Heatmaps are amazing…as long as they don’t use a red-to-green palette.
- Learning to program with data (R or Python) is worth the investment.
- Fluency with VLOOKUP, INDEX, MATCH, pivot tables, and named ranges is a must.
- Many analytics and BI platforms are shockingly rigid and terrible at data visualization.
- No report or analysis should be distributed without some form of QA.
- Most detected anomalies in the data should not be investigated.
- Any visualization that would not work if printed in grayscale is problematic.
- If a recurring report requires manually updating chart references, the analyst is doing it wrong.
- No data set is as straightforward as it is initially believed to be.
- The unplanned destruction/loss of a computer should result in the loss of less than 2 hours of analytics work.
- The process and methods used in any analysis should be documented while the analysis is being done.
- Impressions are not awareness.
- Awareness is measurable (but there is a cost).
- Too many organizations expect new technology to solve people and process gaps.
- All analytics implementations are flawed.
- Most analytics implementations are good enough to get value from in their current state.
- Machine learning and AI can deliver answers, but they are terrible at delivering good questions.
- The physics of the internet are immutable.
- Digital analytics data collection is built on a hack of technology/standards intended for other uses.
- Media analytics is built on a hack of technology/standards intended for other uses.
- “How would I feel if this were on the front page of the NY Times?” is a good litmus test for deciding what to track and how.
- Sparklines are a powerful way to provide meaningful context for a metric in a compact space.
- If the result of an analysis is incredibly surprising, there is probably a flaw in the analysis.
- Gridlines should be turned off in spreadsheets.
- The quality of the design of an analytics deliverable affects the credibility with which it is received.
- All recurring reporting should be fully automated (to the extent possible).
- Delivery of ad hoc analyses should not be set to a fixed recurring schedule.
- If the same analysis is repeated on a recurring schedule, it is not an analysis.
- Analytics cannot replace creative thought.
Bonus #51: Analysts should understand the realities of martech.