The Unfulfilled Promise of Online Analytics, Part 1
Man is the symbol-using animal,
Inventor of the negative, separated from his natural condition
By instruments of his own making
Goaded by the spirit of Hierarchy,
With knowledge of his own mortality
And rotten with perfection.
- Kenneth Burke
For a while I've had a theory that thermodynamic principles can be used to predict user attitudes and behaviors in finite populations. A population threshold has to be reached before accuracy can be achieved. One prediction of the theory is that once that threshold has been reached, the largest segment of the user population will be unsatisfied users of any given product or service. You don't need to sample the entire population, or even the threshold. Another consequence of the theory is that you can create an exemplar group, study it, and make extremely accurate predictions about the entire population (including segments not represented by the exemplar). I've been studying population dynamics across different industries for some time and recently had an opportunity to study the online analytics community; some results of that study are shared here.
I initiated the study by sending the following (or at least a similar) request to people in the online analytics community.
would you be willing to write me your thoughts on “the unfulfilled promise of web analytics/search”? I'm preparing a column/blog post. Your response will be kept confidential (I'm keeping everybody's responses confidential).
My request was intentionally open-ended (surprise!). I wanted to know their responses, not what might be prompted by any guidance on my part.
One respondent wrote, “In a survey, this question would be tagged with a 'leader bias'.” I pointed out that as this particular respondent opened their response with “The question was 'why is it that web analytics isn't delivering on its promise'?” they demonstrated that they were quite willing to follow any leader bias that may have existed. The fact that they rephrased my original request is a demonstration that the bias — hence prejudices, acceptances and beliefs — existed long before my request was made.
The question isn't whether or not leader bias exists; the question is “where were respondents willing to be led?” — a question typical of my (and NextStage's) use of the Chinese General Solicitation. Knowing what someone responds isn't as actionable as knowing how they respond (you're shocked I'd offer that, yes?).
I also didn't ask for responses from people who had previously demonstrated (via other writings, etc) they had a “company line” or brand to protect.
The title of this post was originally “The Unfulfilled Promise of Web Analytics” and came from a conversation I had in early June '09. I was talking with some folks, one of whom was an SVP of web analytics and marketing for an international marketing company. Unprompted, this individual shared their disillusion with the web analytics field and provided the title phrase.
In all I received some 60 responses, ranging from people “just doing their job” to others with national and in some cases international reputations to protect. The responses came from everywhere except South America, Africa and East Asia (I hope to cast a wider net in the future). I kept respondents' original spellings and grammar when quoting them (AmerEnglish is not the native language of several) because doing so keeps their intent clear over my own.
I've studied the “largest user group will be 'unsatisfied' users” phenomenon across industries and (so far) it holds true. No doubt I'll write a formal research paper (and include an extensive bibliography) about this phenomenon in my copious free time someday.
In the meantime, allow me to share the results specific to the online analytics industry with you.
The quote on the right was made to me during a discussion. It was offered jokingly and I accept it as such. I also know a little about how the mind works and where such statements — even as jokes — come from.
The person making that comment went on to tell me about a recent conference they attended. At some point a bunch of attendees got into a cab to go out to dinner. One of them offered that companies wanted more and more accountability in their analytics.
The universal response was “What? Accountability? It's time to get into another business.”
Such responses are understandable and they can only be made by people at or near the top of their industry. Nobody wants to work and everybody wants to play. The more fun (play) they can have in their job the more they'll enjoy it. Being accountable isn't fun, though.
And such responses must be put next to “When Web Analytics came around, my first thought was 'cool, plenty of data'. Little did I know data would replace the actual business reflection that spun all of this.”
I recognized true schisms in the responses. I'll mention one here because it relates to the first quote above (I'll get to the other schisms further on). It deals with people experiencing non-conscious incongruities between their identity-core and their identity-personality (more colloquially, The Impostor Syndrome: feeling they're frauds. Personality, Identity and Core make up an individual's psychological self-concept; different disciplines have different terms for these elements). I would have thought such sentiment would be prevalent at the lower end of the disciplinary spectrum, and it wasn't. More than one well-recognized individual shared that they feel like the emperor without any clothes when challenged about their conclusions. They often want to respond as indicated above: “See this tool? I must know what I'm doing because I use this tool.”
The schism here was more psychological and psycho-social than analytic. Did these individuals have confidence in their analysis? Most often, yes. Did they have confidence their analysis would be accepted/have meaning/provide value?
Sensing or believing that one's work is not honored or respected is damning. Such attitudes are psychological death to the vulnerable and emotionally uncomfortable to the strong.
“…the question of accuracy did not shake off easily. To be totally honest, I kept this to myself.”
Online Analytics is a numerical discipline. That's its whole point; here are numbers that prove something. It is not a psycho-social discipline or, as one respondent wrote, “Maybe new fields need to emerge — web psycho-analytics perhaps?”
Such fields may emerge if they don't already exist. What is true about them — if they're left primarily to the current online analytics paradigm — is that they will require large numbers to demonstrate accuracy. The accurate metricizing of any social system (and the internet as an information exchange is such a system) requires threshold numbers when traditional methods are used. For example, a data space of 50,000 people within 2 days is reasonable for traditional analytics methods to prevail (use of conditional change models can shrink these numbers considerably). Typical numerical methods involving smaller populations require either longer timeframes or smaller environments to demonstrate reliable, repeatable business value.
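To make the threshold idea concrete with a generic statistical sketch (this is standard survey arithmetic, not NextStage's conditional change models, and the function name and numbers are purely illustrative): the classic sample-size formula for estimating a proportion shows how tightening the accuracy target drives the required population size upward.

```python
import math

def required_sample_size(margin_of_error: float,
                         confidence_z: float = 1.96,
                         p: float = 0.5) -> int:
    """Classic sample-size formula for estimating a proportion:

        n = z^2 * p * (1 - p) / e^2

    using the worst-case p = 0.5 unless a better prior estimate exists.
    """
    n = (confidence_z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    return math.ceil(n)

# A 5% margin of error at 95% confidence needs ~385 respondents;
# tightening to 1% pushes the requirement into the thousands.
print(required_sample_size(0.05))  # 385
print(required_sample_size(0.01))  # 9604
```

The point of the sketch is only directional: traditional methods buy accuracy with volume, which is why smaller populations need longer timeframes or smaller environments to yield the same confidence.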
“Web psycho-analytics” requires different numerical methods and mathematical paradigms from traditional analytics to demonstrate reliable, repeatable business value.
The above is especially true when individualization — the ability to recognize a visitor as neuro-socio-psychologically unique from all other visitors — is to occur.
However, until such methods are widely adopted, clients and consultants are left with
- conflicting numbers from different tools (“…your analytics will not match the vendor's numbers. If you add two or three analytics systems, the numbers will not match each other. This creates situations where it is impossible to reconcile any data sets.”),
- conflicting numbers from the same tool (“Even using the same tool depending of how it is set-up it can lead to very different numbers.”),
- tools that are difficult to use (“And those vendors said it was really, really easy! Pfff! Liars!”),
- conflicting vendor definitions (“Vendors have different standards, meaning that what one vendor considers a visit is not the same as another vendor, thus making comparisons is often misleading.”) and
- unachievable expectations (“Web Analytics is often sold as the thing that will improve your website results by 100-200%, well that's not true.”).
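The “conflicting numbers” complaints above often have a mundane mechanical root: tools sessionize differently. A toy sketch (my own illustration, not any vendor's actual logic) shows how the very same hit stream yields different “visit” counts under different inactivity timeouts — no one is lying, yet the numbers will never reconcile.

```python
def count_visits(hit_times_minutes, timeout_minutes):
    """Count 'visits' (sessions) by splitting a single visitor's hit
    stream wherever the gap between consecutive hits exceeds the
    inactivity timeout. Timestamps are minutes, assumed sorted."""
    if not hit_times_minutes:
        return 0
    visits = 1
    for prev, cur in zip(hit_times_minutes, hit_times_minutes[1:]):
        if cur - prev > timeout_minutes:
            visits += 1  # gap too long: a new session begins
    return visits

# One visitor's hits, measured under two hypothetical vendors' rules:
hits = [0, 10, 45, 120, 125, 200]
print(count_visits(hits, 30))  # 4 visits with a 30-minute timeout
print(count_visits(hits, 60))  # 3 visits with a 60-minute timeout
```

Multiply this one small definitional difference across visit, visitor, and page-view definitions, and the impossible-to-reconcile data sets the respondents describe follow naturally.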
“…talk to other people about what you were trying to accomplish and beg for them to play along.”
Jim Sterne asked me a while back at an eM SF what I'd learned about web analysts. I was onstage at the time. My statements have (I believe) proved Cassandric. I offered that there was discontent bordering on malcontent and little to no job satisfaction, and I advised the eM staff to start shifting their conference focus from pure WA to cross-disciplinary offerings. I've also openly stated that I left the WAA because it had all the hallmarks of a society in decline (think Rome, Persia, the USSR, …, all overthrown by invaders from without or within).
This comment was made a few years back and I have no knowledge of the WAA as it exists now.
Where does this discontent among online analysts come from?
One place is unaccepted accuracy (as mentioned above). And if the accuracy is accepted, it isn't acted upon. But there are a lot of hindrances to accuracy that are beyond the analyst's control.
Tagging seems to be a major issue in this area. Tagging was originally considered a solution to the accuracy problem, but the world works in balance — especially when unnatural processes are assembled. Tagging solved accuracy issues but required more sophistication in the collection and analysis of the data. This sophistication required the involvement of other organizational players, some of whom couldn't or wouldn't play along. The end result is that tagging — a relatively simple concept and method — still has no industry standard.
The tagging problem (I believe) would go away if clients — not vendors and not consultants — were invited to find common ground in what they're looking for (more on this later). At present clients have no fixed, pervasive idea of what advantage online analytics provides. It doesn't reduce costs; reducing costs is done by rethinking processes. Instead of rethinking their internal processes, companies “…have become very lazy.” When the website isn't producing what they believe it should produce, the solution is to get another tool.
But there is no magic bullet. Companies who go from one tool to the next are like psychotherapy patients who stay in therapy with no desire to get well. One respondent confided “So long as I show that I'm doing something I'm not responsible if nothing useful gets done.”
Business politics can not be ignored when considering the unfulfilled promise of online analytics. “Where one person has all the authority and all the ability to change a site as they see fit: optimization actually really works. A new headline here, a picture of somebody looking into the camera there. Demand increases, everybody's happy. But, for most corporations, this is not the case” was a sentiment stated often if not as eloquently by many respondents.
“If only I had this report, as shown into the vendors slick presentations…”
One of the problems that came through the responses was that online analysts often demonstrate a victim mentality. This was greatly the case in web analytics and often in search. When asked directly, “If you know these problems exist in your industry why don't you take steps to solve them?” responses tended to manifest feelings and verbiage of powerlessness. The phrases “We can't do anything to solve this.”, “It's out of our hands.”, “It's beyond our control.” and “We don't have access to those people.” were repeated on both sides of the Atlantic. One respondent offered “…is it in the realm of a little web analyst within a large multinational to actually do that?”
Consulting online analysts are caught between the vendors and the clients they serve. If not a victim mentality, this “serving two masters” creates a psychology that's very close. It also ties back to Why hasn't Marketing caught on as a “Science”? and Matching Marketing and IT Mythologies about analysts and marketers finding common ground.
Consensus points abounded on these research elements. Vendors were viewed
- as only being interested in selling licenses,
- as promising more than could be delivered (“the space has been high jacked by vendors who promise a mountain of diamonds without much effort. This is not true.”),
- as not offering proper or worthwhile optimization tools and methods (“…the important thing is the optimization that is done afterwards.”) or
- as offering wolves in sheep's clothing — tools that actually produced simplistic results, could not do deep analysis and therefore produced skepticism about the underlying data.
It is likely that as more and more accountability is demanded from different organizational groups measurement efforts will merge and (perhaps) result in easier corporate buy-in. What may not go down well is that these efforts are more likely to come from marketing than from analytics. Multi-channel marketing will need to learn from online analytics if it is to have value to any business.
“There are simply not enough employees in the companies focusing on adopting web analytics in the organization.”
A challenge that falls out of the above section and the above quote is truer for web analysts than for their search-based compatriots in any given organization. Web analysts have fewer champions at the top of the corporate ladder than do marketers and search practitioners (search is often not considered an analytics discipline even though the science of search has been documented elsewhere). Marketers have traditionally been closer to the top of the corporate recognition ladder than analysts could ever be. This goes back to the opening statements about work versus play: marketers play, analysts work. This is demonstrated in language if not in physical reality, and one needs to recognize that perception is reality.
Online analytics grew out of (and could very well still be mired in) IT departments. Worse, any kind of analytics smells of accountants (hence accountability), and everybody knows the accountants only come in when the business has failed or is recognizably close to failing. One respondent wrote that they knew their company was in trouble when the bank sent accountants in (evidently the waves of layoffs and learning they were US$20M in the hole weren't warnings enough).
Online analytics is a discipline of numbers. Whenever there's a discipline of numbers it means there's an evidentiary trail for decisions. Consider the political and psycho-economic meaning of this for a moment.
If I have the option of taking advice from someone who goes with their gut, then I really can't be held accountable because there are no numbers; from any evidentiary standpoint I'm pretty safe. Should things go sour, it's a political issue because there was no real evidence that we should have gone pro or con; we went with our guts, flipped a coin and took what came.
Even better, it was (point finger in some general direction) their gut feeling, we went with it, it flopped, it was their gut not mine, they're out and I'm still good.
But if I go with hard numbers and my decision is in error? Now it's psycho-economic and I'm the idiot and fool because I didn't understand what I was doing. Both I and the group that helped me make the decision are forfeit.
So which is politically safer to place higher on the corporate ladder, to listen to and feel good about? But even at the top of the corporate ladder guts and numbers are in conflict; the average CMO corporate lifespan is about two years, often less.
“The tools promise a lot, and can live up to most of it.”
Most psychotherapists would look at the responses and recognize a love-hate relationship in the making if not already extant.
However, the love-hate relationship doesn't take the form most psychotherapists are familiar with. Most love-hate relationships exist between an individual and some one thing external to that individual (another person, another thing). The love-hate continuum usually takes the form of “I can't live with (it) and I can't live without (it)”.
This isn't the case with analysts. Most of those surveyed like what they do and believe they add value for their efforts. They love what they do, just not who they do it for or how it is done (“It's not so much the unrealized promise of web analytics, as organizational politics leading to weak and vaguely defined goals in larger organizations.”). This creates a triangulism, and triangulisms are always psychologically deadly.
An example of psychological triangulism is the parent (call them Parent 1) who loves their partner and recognizes the partner has an unhealthy relationship with their common child. Parent 1 is caught between protecting the child from the partner and protecting the partner from the eventual wrath of the child (think Oedipus and Electra). Their loyalties are constantly divided (as mentioned above, and especially if no psychological reward manifests itself). The psychological challenge escalates until Parent 1 develops their own animosity towards the child, mistakenly believing that if the child were not present the parent-partner relationship would be better.
The end result is that both the child and the parent-partner relationship suffer. Here both the analyst-client relationship and the online analytics industry are suffering.
This tension is manifesting in the industry in the same way it manifests in the therapist's office — fingerpointing. Consider the following responses, some obviously from consultants, others from vendors, and those where the lines blur greatly:
- “What's sure is that when it comes to Web Analytics and vendors, there's just one number that counts: quarterly sales. When it comes to clients, well they don't really know which number they're looking for but know it's damn hard & expensive to get it. Those in between, the expensive consultants, they're just trying to make a living and fight for peace on earth and accountable decision making.”
- “Log file data and Web analytics are both sources of information. They are tools, like a hammer. A hammer in the hands of an unskilled, ignorant but self-righteous and overly confident carpenter? That is a scary thought. Well, it is equally as scary to me about Web analytics and log file data. There are plenty of unskilled, ignorant, self-righteous, and overly confident search engine optimization (SEO) professionals, Web analysts, and other marketing people. Even many search engine software engineers are not competent carpenters or architects, but they honestly believe they are. And we are buying what they have to sell.”
- “I honestly don't think there are unfulfilled promises of web analytics. The companies are doing great and the software is progressing all the time. I love analytics!”
- “Why is it so hard for people in the web to take actions and optimize based on what the tool reports? One of the reasons of this is that often they don't have a clue of what can be changed or they have an idea which is incorrect.”
- “If web analytics is under-delivering in any way, it is largely because of most organizations inability to address web analytics at the strategic level rather than a tactical tool to optimize the online marketing channel.”
- “I think the promise is fulfilled for some and not for others! The difference is the level of sophistication of the user. For some companies, even if they deploy it properly there is more volume and nuance to the data than they can properly grok.”
Is more training the answer? And if so, who do we train?
I'll restate here what I wrote in Learning to Use New Tools; the use of any tool is going to require training across the usage spectrum. The use of new tools definitely so. This training can be self-training and the user should be prepared for scraped knuckles, smashed thumbs and lots of cursing. Self-training is great when the user has lots of time and patience. Otherwise, take a class or let the experts (“consultants”) in.
Do remember Buckminster Fuller's definition — an expert is someone who can spit over a boxcar. I often tell people that the front of my shirt is soaked from my failed efforts.
More training is the answer only if the training results in well-reasoned and understandable business actions. Tools and trainings are worthless without knowing what one wants to build (“It reminds me of the development of web sites themselves ten years ago – everybody had to have one, still not being absolutely sure what to use them for. Of course the free tools have done their part in this evolution.”).
“The unfulfilled promise of web analytics and search is measuring outcomes instead of outputs.”
Our culture (western, not analytics) has been “objective and evidence driven” for about 400 years. There has been the unstated Field of Dreams-like belief that “If you have the numbers, the truth will come”.
I believe most of the analysts surveyed would consider this a desirable yet inaccurate depiction of the real world. Their tools produce “…beautiful charts that don't tell me what to do to make things different – not better, just different. For that I have to go somewhere else.” None of the analysts surveyed wrote or talked about growth curves, forward discounting, debts, rates of depreciation, technological obsolescence, energy consumption (the company that can correctly respond to market needs faster wins because a) it responded correctly and b) it required less energy to do so).
This greatly surprised me. For all the analysts “in the room”, none talked of analysis. Several responses demonstrated a level of contempt regarding available tools (vendor agnostic) so it's possible analysis per se isn't a subject of high regard in its own community.
The above presents a discomfiting scenario. It demonstrates a severe disconnect between “what should be” and “what is”, something in keeping with C.P. Snow's two cultures yet far more pervasive (in this industry) therefore far more damaging.
If this paper focuses more on psychologies than on analytics it's because the responses dictated it so.
“…analysis is a story based upon data put into context.”
The quote starting this section was telling but not unique. No respondents believed the numbers alone proved anything, even when presented as part of a strategy. And few respondents seemed as comfortable in boardrooms as in spreadsheets.
Yet the need for online analytics to be part of a larger picture, a grander story, was everywhere. Analysts uniformly perceive themselves as
- not part of a unified business reporting structure,
- not contributing to the big picture, and
- lacking the political power or psycho-social maturity (within the organization) to sit at the grownups' table.
“And then there's this vague notion from Mr. Kaushik…”
Let me emphasize that I did not choose the exemplars noted in this paper. Respondents demonstrated exemplar recognition in conversation and written material.
Any pervasive duality will present itself in exemplars (not to be confused with my earlier mention of exemplars as part of this research). Here the exemplars (or probably more accurately, “doyens”) are Avinash Kaushik and Eric Peterson, with Avinash Kaushik leading the pack in references by almost three to one.
Equally interesting was that the anti-Kaushik camp's complaint wasn't necessarily against Avinash Kaushik himself, it was against his “You, too, can do this” mantra (perceived if not actual). Yet another schism appears; those who need (for whatever reason) analytics to be hard and those who need it to be easy.
Eric Peterson is well known for his “Web Analytics is hard” statement (an interesting Reading Virtual Minds Vol. 1: Science and History tie-in: the majority of respondents wrote “web analytics” or “search” in lowercase. Very few capitalized online analytic disciplines. Most people capitalize their own discipline; doing so demonstrates a non-conscious recognition of the value of what they do).
This belief raises the question of whether something can be “hard” (meaning “difficult”) if it is properly understood. Educational Psychology, Cognitive Psychology, Sports Medicine, Kinesiology and related disciplines all demonstrate that anything done improperly is hard. Many people give up on mathematics due to poor teachers, poor curricula, lack of discipline, … To them, math is hard. Aikido is dangerous without proper instructors present.
But is something in and of itself difficult? Only if there's a social or political reason for it to be so. Perhaps the priests wish to keep the mysteries of the divine for themselves. This provides them the opportunity to select who'll enter their ranks, who'll excel, and to whom the teachings will be “difficult”. Only one respondent offered a centralizing attitude (“I'd rather be of the school of thought that web analytics can be easier… if given time and approached in the right way.”).
Here again politics, more than psycho-economics, rears its head: “I will protect my (place in the) industry by making it difficult for others to succeed in that industry,” hence controlling the industry itself. The problem with this ethos is that eventually a large enough (née “threshold”) group will arise that takes the industry in some other direction completely.
There are psychological ramifications to both “hard” and “easy” statements. “Hard” statements set up the majority of participants to fail, or if not to fail then to prepare for failure rather than success. Likewise, “easy” statements can cause false expectations of success to develop. What is obvious from the responses is that Avinash Kaushik owns the “actionable outcomes” space and the neuro- and psycho-linguistic Towards space when it comes to online analytics as a discipline (his was the only work directly quoted in the responses: “Actionable insights and metrics are the uber-goal simply because they drive strategic differentiation and a sustainable competitive advantage.”), and that Eric Peterson owns the neuro- and psycho-linguistic AwayFrom space.
AwayFrom and Towards are used in their neuro- and psycho-linguistic sense here to describe how people — hence the industry — are thinking, not necessarily how the industry is moving. See AllBusiness.com's Chris Bjorklund interviews viral marketing expert Joseph Carrabis, founder of NextStage Evolution, Part 4a and Using Sound and Music on Websites for more on these concepts.
The exemplar messaging is polarizing an industry already divided by a great many other factors. I can say playing guitar is easy and I know I'm never going to be a Segovia or Kottke. Likewise, I recognize I could play better if I practiced more. This “centering of duality” needs to take place in the online analytics world if it is to survive, yet most respondents demonstrated extremum statements (statements with language demonstrating polarity behavior and belief) rather than centering statements (statements with language demonstrating unifying or centering behavior and belief) in their responses.
All things require some degree of practice before facility in their use is obvious. There's also the intersection of lack of correct practice and lack of understanding, which can mix into The Impostor Syndrome mentioned earlier (see Reading Virtual Minds Vol. 1: Science and History or I'm the Intersection of Four Statements for more on The Impostor Syndrome). Anything can be difficult if the practitioner doesn't really understand what they're doing — acting by rote, but from neither repetitive action nor repetitive practice of the correct action.
Disciplines may be represented by exemplars, and responses to the exemplars are sometimes not responses to the discipline. Respondents tended to present AwayFrom behaviors regarding Avinash Kaushik and Towards behaviors regarding Eric Peterson in their responses (noting, as offered earlier, that Avinash Kaushik is more in their consciousness than is Eric Peterson, and with basic normalization applied).
These presentations are understandable. Correct or not, the perception is that Avinash Kaushik wants to move the industry away from a “numbers are evidence” basis (one respondent offered “And then there's this vague notion from Mr.Kaushik: give more insights, knowing more about what's going on within your visitors minds & hearts so that you can better service them. Sure, cool, sounds great. Still scratching my head. With surveys you say? Asking them a question? Just 4 questions? Ok so when I get the answers, is this representative? Should it influence my copywriting, my product offering, my pricing scheme?”).
The concept of “knowing more about what's going on within your visitors hearts and minds” is one I and NextStage strongly encourage.
You're shocked, I know. Simply shocked.
I also encourage evidentiary — hence numbers based — decision making practices.
A curiosity of this research is that no exemplars arose on the search side of online analytics. Search respondents noted Avinash Kaushik and none of their own. This could be due to the different lifespans of search and web analytics, the different mentalities and ego structures that arise in these two disciplines or simply that no one in search demonstrates a strong enough personality for a cult-of-personality to develop around them.
“How you measure success depends on how you define success”
There are many ways to interpret the above and all of them point to a lack of standardization. I remember conversations where the definition of success was moving away from online sales to “I got their name” or “they downloaded a paper”. These conversations always intrigued me because they were examples of defining success in terms of the visitor's action, not the desired outcome of the site owner.
This is another example of non-standard definitions plaguing an industry and no one stepping up to lead the way (equally interesting, no respondents mentioned any professional organizations in their communications. This indicates online analytics professional organizations are not serving their membership enough to warrant conscious recognition). Online analytics is quite capable of comparing the numbers between “sales” and “newsletter signups” and the comparison truly is one of apples and oranges; business development versus transactional business, strategic vision versus “I went to the bank today” tactics.
And what if the success definition the consultant is comfortable with, knows how to demonstrate, and can defend is one the business client no longer accepts?
“The consensus among industry leaders is that web analytics will be a different entity in five years.”
Clients are asking for more … something … from their vendors. One respondent stated “Procter&Gamble is moving from 'eyeballs' to 'engagement' but leaving 'engagement' for others to define.”
This is the intersection of Jim Sterne's “how you measure success” mantra with the “gut vs numbers” statements above. The only sure outcome of letting others define your success is that politics will prevail.
A recent Forrester paper indicates a move towards free analytics over for-pay analytics. The report is interesting and perhaps more interesting when viewed outside the online analytics silo.
I point out in Reading Virtual Minds Volume 1: Science and History that growth numbers can seem impressive until you recognize population dynamics, population ecology and evolutionary rescue at work. I used these and similar concepts in From TheFutureOf (13 Mar 09): The Analytics Ecology and From TheFutureOf (5 Jan 09): Omniture and Google Considered Environmentally to indicate that populations would shift, go near death then bounce back dependent entirely on the existence of (again) threshold populations (I hope readers appreciate how important the threshold population concept is in any socio-environmental dynamic).
Conclusions for Part 1
In the end, it seems the online analytics world is setting itself up to fail. It's as if an architect were to create a negative space and then attempt to fill it. Analytics, be it search or web, doesn't matter; all business — B2C, B2B, B2whatever, on whatever platform you're using — is going to come down to personal relationships: establishing them, maintaining them, personal interaction and commitment (readers who've heard or seen my “10 Must Messages” presentation will recognize those communications here).
Nothing communicated by any respondent indicated that analytics is in and of itself a worthless discipline, only that it is a misunderstood, hence misguided, discipline in the online world. Yes, all forms of analytics will get you to the door (and in some cases may even open the door), but in the end it will be the establishment and demonstration of trust that powers commerce, not numbers. Or at least not numbers alone. This indicates a shift
- in what the numbers are about,
- how they are demonstrated,
- how to derive actionable meaning from them and
- how accountability is framed
is in the offing.
Problems are (in my experience) pretty easy to discover. Solutions, though…
(more on possible solutions in next month's post)
Have you read Reading Virtual Minds Volume I: Science and History? It's a whoppin' good read.