Articles Posted in Tools

In the sciences, a recent movement is often referred to as “reproducible research.”  What it  espouses is a philosophy of transparency regarding data and analysis – share the data you collected, what you did to it, and how you did calculations and graphics.  Those who conduct surveys, for example, should make every step of what they did clear to others and available to them for review.  They should explain how they gathered their data, what they did to prepare it for analysis, the steps they carried out in the mathematical analyses and then, of course, their conclusions.

In the sciences, reproducible research has gone even further to make the actual data sets available to others.  Unfortunately, too many times scientific findings have failed to be corroborated by others.  Indeed, there have been some well-publicized instances of fraudulent research, fake data, and unsupportable conclusions.  That sort of check on quality is possible only if someone else can follow your tracks.

To the extent that law department data developed by vendors, consultants, and academics is used to produce findings, reproducible research should be the aspiration.  We may not be able to go so far as to expose the actual proprietary data that is collected, but all of us can go much farther than we do now to explain how the data was collected and what was done with it.  Explain your methodology!  Moving in that direction would improve the quality of findings and the reliability of results.

Mostly for lack of a better way to classify companies, benchmark surveys ask respondents to choose from a list of “industries.”  We see those lists all the time: manufacturing, technology, pharmaceutical, and so on.  In the messy real world, we all realize, companies are not so neatly boxed and defined.  Indeed, almost any company of much size does business in what could be considered more than one industry.

One way to measure diversification, and thereby to study more accurately the effects of industry on legal staffing and spending, would be to use an entropy calculation.  For a firm with N different four-digit SIC business units, where P_i is the proportion of firm sales in SIC code i,

$$\text{Total entropy} = \sum_{i=1}^{N} P_i \ln\!\left(\frac{1}{P_i}\right)$$
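To make the arithmetic concrete, here is a minimal sketch in Python; the SIC codes and sales figures are hypothetical, and in practice the segment data would come from a firm's reported financials.

```python
import math

# Hypothetical sales (in $ millions) by four-digit SIC business unit.
sales_by_sic = {"2834": 600.0, "3841": 250.0, "2836": 150.0}

total_sales = sum(sales_by_sic.values())
proportions = [sales / total_sales for sales in sales_by_sic.values()]

# Total entropy = sum over i of P_i * ln(1 / P_i).
entropy = sum(p * math.log(1 / p) for p in proportions if p > 0)

print(f"Diversification entropy: {entropy:.3f}")
```

The measure runs from zero for a single-segment firm up to ln(N) when sales are spread evenly across N segments, so higher values indicate greater diversification.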

Software is now available that could take all the blog posts on GC Metrics’ Law Department Management and all the articles written in the past five years and all the books about leading law departments and analyze their contents.  A combination of algorithms that use machine learning, network analysis, data mining techniques, and graphics could enable new understandings of the prevalence of ideas about management in corporate legal settings. These tools, which involve statistical parsing and aggregation of large amounts of text, could give us a different picture of how ideas generate, spread, and become mainstream, marginal, or moribund.

For example, the notion (and the term itself) of convergence might have first appeared in the early 1990s, but its frequency peaked a decade later – or did it?

Comments posted on social networks such as LinkedIn and Legal OnRamp could be ore for this mine.  With all that material available, analysts could track the use of words over time, compare related words, and graph them.  Think of one form of the output as a concept genealogy.
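As a rough sketch of what such tracking might look like, the Python fragment below counts the relative frequency of a term across a dated corpus; the posts and dates are invented placeholders.

```python
import re
from collections import Counter, defaultdict

# Hypothetical stand-ins for a corpus of dated blog posts and articles.
posts = [
    (1994, "Convergence of outside counsel panels is the new watchword."),
    (2003, "Some say convergence peaked years ago; convergence talk persists."),
    (2012, "Alternative fee arrangements, not convergence, dominate discussion."),
]

# Tally word counts per year.
counts_by_year = defaultdict(Counter)
for year, text in posts:
    counts_by_year[year].update(re.findall(r"[a-z']+", text.lower()))

# Relative frequency of one term per year -- raw material for a concept genealogy.
term = "convergence"
for year in sorted(counts_by_year):
    counter = counts_by_year[year]
    print(f"{year}: {counter[term] / sum(counter.values()):.3f}")
```

A real analysis would draw on thousands of documents and add stemming, phrase detection, and plotting, but the core bookkeeping is no more than this.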

Several posts on this blog have laid out criticisms of various well-known management tools.  This first post of two starts with a half dozen.  For each of the six tools listed below I begin with one or more references to previous posts that give drawbacks of the tool.  I continue with citations to any recent posts regarding the tool published after earlier metaposts.

Brainstorming (See my post of Dec. 31, 2008: criticisms and suggestions; and Jan. 28, 2011: brainstorming replaced by techniques based on neuroscience.).

Delphi process (See my post of Aug. 25, 2009 #3: criticisms of the technique; June 27, 2012: peer pressures as another criticism of the Delphi technique; Dec. 9, 2005: Delphi method (nominal group technique); Feb. 1, 2006 #1: origins of the method; and June 16, 2011: conclusions derived through the Delphi technique.).

Continuing my series on the pitfalls of popular management tools, I offer some for process maps (See my post of Aug. 28, 2005: some criticisms; Aug. 6, 2010: components of process improvement; Sept. 22, 2010: compared to procedure guides; Nov. 19, 2010: contrary to a Romantic view of management; March 14, 2011: can lead observers to feel legal practice is too rote; May 31, 2011: eBay and process maps by procurement; and June 9, 2011: one of the Six Sigma tools.).

Pick trivial processes that no one really cares about or that will make no material difference to the effectiveness of the law department.

Ladle on lots of description, but no prescription.  Spend hours saying what happens but not even minutes suggesting what should happen differently and better.

Intriguing and intuitively attractive, scenario planning has a buzz about it.  This blog has several posts that refer to it, but as yet I have not challenged the technique (See my post of Aug. 25, 2009: uses of scenarios in legal departments with 18 references; March 1, 2010: elaborate scenarios at Microsoft; Aug. 15, 2011: scenarios as training tools; and Dec. 1, 2011: base rate neglect when we think about scenarios.).  Here are some of my doubts.

  1. Little empirical evidence exists that compares the vaunted method to other methods of strategic planning.  We always read about Royal Dutch Shell, but not much else.
  2. Participants may not be able to think creatively, out of the box.  They are much more likely to extrapolate from the present, but in a world full of unanticipated interactions, straight-line projections often go awry.

In my consulting projects, I am not a fan of focus groups.  Here are some of my concerns about them.

  1. They waste many people’s time.  Members sit through long stretches with little to contribute, and the moments when they add value to others often represent a small portion of the time they give up.
  2. They do not make candid, full comments.  People do not want to bring up sensitive topics or appear critical of others.  Subordinates lapse into silence.  Members may not say what they think nearly as openly as when they are in a confidential, one-on-one interview or on an anonymous survey.

From time to time this blog mentions where several law departments have banded together to push an initiative. I refer to them as collective actions and praise them. Progress would be made on several management fronts if fellow-traveler departments more often combined their resources.

To be helpful, I wrote an article about collective action by law departments and what some of the foreseeable consequences of its spread might be.  You are invited to read my collective action article, published in the National Law Journal.

Back in the mists of time, I wrote about the 18 tools that general counsel most commonly use (See my post of April 14, 2005: based loosely on Bain’s annual studies of tools.).

Seven years later, I would no longer include balanced scorecards, employee satisfaction surveys, psychometric tests and retainer letters. Recreating what my thought process was back then when I included them gets me nowhere. Today I recant: those four tools rarely show up in law departments.

As to three other tools, I am dubious, to put it mildly, about including client satisfaction surveys, mission statements, and strategic plans. Although they have their adherents, those tools hardly rise to the level of “common.”

A tool to help communicate about and plan for a complicated project is what some analysts call a design structure matrix (DSM). As described in the Harvard Bus. Rev., Oct. 2011 at 106, in a DSM, “a project’s tasks are listed along the rows and columns of a matrix, and the team marks whether each item is related to the others, designating each relationship as either a direct dependency or a feedback loop.”

Matrix algebra can then calculate a recommended order for the tasks, or a simpler spreadsheet function can show the interactions.  DSM sounds like industrial-grade Gantt charting, but for hefty projects run by legal departments, a watered-down version might be helpful.
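As a rough illustration of the ordering step, the sketch below treats a DSM as a mapping from each task to the tasks it depends on and derives a recommended sequence; the tasks are hypothetical, and a real project matrix would also record feedback loops.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical DSM for a contract project: each task maps to the tasks
# it directly depends on (the marked cells in its row of the matrix).
dsm = {
    "draft contract": set(),
    "business review": {"draft contract"},
    "legal review": {"draft contract"},
    "negotiate terms": {"business review", "legal review"},
    "sign and file": {"negotiate terms"},
}

# A topological sort recommends an order in which no task starts before
# its dependencies; a graphlib.CycleError would flag a feedback loop.
order = list(TopologicalSorter(dsm).static_order())
print(" -> ".join(order))
```

Run as-is, this prints a workable sequence such as draft contract -> business review -> legal review -> negotiate terms -> sign and file.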

I have written about matrices, which are sometimes simply tables or grids (See my post of March 10, 2005: Johns Manville’s RAPP matrix; May 28, 2005: a spending matrix; May 3, 2006: PetSmart’s complex matrix; Feb. 1, 2007: nine-box tool; March 25, 2009: grid analysis methods; Aug. 4, 2009: Laffey matrix; Aug. 9, 2010: law firms put in portfolio matrix; Sept. 27, 2010: patent lawyers team with colleagues to prepare an opportunity matrix; and Nov. 8, 2010: contract approval matrix.).
