Articles on Tools

Reproducible research as an aspirational goal for legal metrics

In the sciences, a recent movement is often referred to as “reproducible research.”  It espouses a philosophy of transparency regarding data and analysis: share the data you collected, what you did to it, and how you carried out your calculations and graphics.  Those who conduct surveys, for example, should make every step of what they did clear to others and available for review.  They should explain how they gathered their data, what they did to prepare it for analysis, the steps of their mathematical analyses, and then, of course, their conclusions.


In the sciences, reproducible research has gone even further: researchers make their actual data sets available to others.  Too many times, unfortunately, scientific findings have failed to be corroborated by others.  Indeed, there have been some well-publicized instances of fraudulent research, fake data, and unsupportable conclusions.  That sort of check on quality is possible only if someone else can follow your tracks.


To the extent that law department data developed by vendors, consultants, and academics is used to produce findings, reproducible research should be the aspiration.  We may not be able to go so far as to expose the actual proprietary data that is collected, but all of us can go much farther than we do now to explain how the data was collected and what was done with it.  Explain your methodology!  Moving in that direction would improve the quality of findings and the reliability of results.

The “industry” of a company and a way to create an index of diversification with entropy measurements

Mostly for lack of a better way to classify companies, benchmark surveys ask respondents to choose from a list of “industries.”  We see those lists all the time: manufacturing, technology, pharmaceutical, and so on.  In the messy real world, we all realize, companies are not so neatly boxed and defined.  Indeed, almost any company of much size does business in what could be considered more than one industry.


One way to measure diversification, and therefore to study more accurately the effects of industry on legal staffing and spending, would be to make use of an entropy calculation.  For a firm with N different four-digit SIC business units, where P_i is the proportion of firm sales in SIC code i, the total entropy is

Total entropy = \sum_{i=1}^{N} P_i \ln(1/P_i)


That formula is a weighted average of the segments’ sales share, the weight for each segment being the logarithm of the inverse of its share. The measure thus takes into consideration two elements of diversification: (1) the number of segments in which a company operates; and (2) the relative importance of each segment in the total sales of the company.


A hypothetical company, called EntropyCo, has two-thirds of its revenue coming from one four-digit SIC code business unit and the other third from a different SIC code business unit.  Its total entropy would then be (2/3)ln(3/2) + (1/3)ln(3), which equals about 0.64.  A comparable company that earned three quarters of its revenue from one business unit and one quarter from the other would have an entropy figure of about 0.56, which is lower, as it is less diversified than EntropyCo.
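
For readers who want to compute the measure directly, here is a minimal sketch in Python; the function name and the revenue shares are illustrative assumptions, not part of any survey's toolkit:

```python
import math

def total_entropy(shares):
    """Total entropy of diversification: the sum of P * ln(1/P)
    over a firm's business-unit revenue shares."""
    if not math.isclose(sum(shares), 1.0, rel_tol=1e-9):
        raise ValueError("revenue shares must sum to 1")
    return sum(p * math.log(1 / p) for p in shares if p > 0)

# The two hypothetical companies from the example above
print(round(total_entropy([2/3, 1/3]), 3))  # 0.637 (EntropyCo)
print(round(total_entropy([3/4, 1/4]), 3))  # 0.562 (less diversified)
```

Note the behavior at the extremes: a firm with all its revenue in one unit scores zero, while an even split across N units scores the maximum, ln(N).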


With the tool of entropy, if participants in a benchmark survey were to break down their revenue by more than one industry, the benchmark metrics could be more finely calibrated.


*******************

Take the GC Metrics 2013 survey at https://novisurvey.net/n/GCMetrics2013.aspx.  The no-cost survey asks for your 2012 number of lawyers, paralegals, and other staff; inside and external legal spend; and revenue.  In return, you will receive the Winter Release, with benchmarks from more than 1,100 law departments.

The arrival of text mining and its implications for tracking ideas important to law department management

Software is now available that could take all the blog posts on GC Metrics’ Law Department Management, all the articles written in the past five years, and all the books about leading law departments, and analyze their contents.  A combination of algorithms that use machine learning, network analysis, data mining techniques, and graphics could enable new understandings of the prevalence of ideas about management in corporate legal settings.  These tools, which involve statistical parsing and aggregation of large amounts of text, could give us a different picture of how ideas generate, spread, and become mainstream, marginal, or moribund.


For example, the notion (and the term itself) of convergence might have first appeared in the early 1990s, but its frequency peaked a decade later – or did it?


Comments posted on social networks such as LinkedIn and Legal OnRamp could be ore for this mine.  With all that material available, analysts could track the use of words over time, compare related words, and graph them.  Think of one form of the output as a concept genealogy.
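
A minimal sketch of how such tracking could begin, in Python; the corpus, the helper name, and the sample sentences are all hypothetical stand-ins for years of posts, articles, and books:

```python
from collections import Counter
import re

def term_frequency_by_year(corpus, term):
    """Count a term's occurrences per year across a corpus of
    (year, text) pairs -- raw material for a concept genealogy."""
    pattern = re.compile(r"\b" + re.escape(term) + r"\b", re.IGNORECASE)
    counts = Counter()
    for year, text in corpus:
        counts[year] += len(pattern.findall(text))
    return dict(sorted(counts.items()))

# Illustrative stand-in corpus; real input would span blogs, articles, books
corpus = [
    (2009, "Convergence of outside counsel remains a popular initiative."),
    (2010, "Panel consolidation, once called convergence, has matured."),
]
print(term_frequency_by_year(corpus, "convergence"))  # {2009: 1, 2010: 1}
```

Plot those yearly counts, normalized by the volume of text each year, and you have a first cut at the genealogy of a concept like convergence.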


These ideas came from an article in the NY Times, January 27, 2013, at BU 3.  They would allow people with experience in running corporate legal teams to see their conceptual world from a different, quantitative perspective.  Consultants could make use of this material, and there would be ample benchmarking opportunities.

Debunking some management tools general counsel might consider using

Several posts on this blog have laid out criticisms of various well-known management tools.  This first post of two starts with a half dozen.  For each of the six tools listed below I begin with one or more references to previous posts that give drawbacks of the tool.  I continue with citations to any recent posts regarding the tool published after earlier metaposts.


Brainstorming (See my post of Dec. 31, 2008: criticisms and suggestions; and Jan. 28, 2011: brainstorming replaced by techniques based on neuroscience.).


Delphi process (See my post of Aug. 25, 2009 #3: criticisms of the technique; June 27, 2012: peer pressures as another criticism of the Delphi technique; Dec. 9, 2005: Delphi method (nominal group technique); Feb. 1, 2006 #1: origins of the method; and June 16, 2011: conclusions derived through the Delphi technique.).


Focus groups (See my post of July 20, 2012: 8 criticisms of focus groups.).


Process maps (See my post of Aug. 28, 2005: some criticisms; and July 24, 2012: additional criticisms of process mapping.).


Scenario planning (See my post of July 20, 2012: 8 problems with scenario planning efforts.).


SWOT analysis (See my post of Aug. 28, 2005: five criticisms of the method; Dec. 9, 2006 #1: SWOT review of large legal department discloses risk aversion; Jan. 13, 2006 #3: SWOT’s historical roots; Nov. 5, 2007: SWOT analyses that glide over the W[eaknesses]; and Nov. 7, 2007: strengths, weaknesses, opportunities and threats.).

Various ways to get lost with law-department process maps

Continuing my series on the pitfalls of popular management tools, I offer some for process maps (See my post of Aug. 28, 2005: some criticisms; Aug. 6, 2010: components of process improvement; Sept. 22, 2010: compared to procedure guides; Nov. 19, 2010: contrary to a Romantic view of management; March 14, 2011: can lead observers to feel legal practice is too rote; May 31, 2011: eBay and process maps by procurement; and June 9, 2011: one of the Six Sigma tools.).


Pick trivial processes that no one really cares about or that will make no material difference to the effectiveness of the law department.


Ladle on lots of description, but no prescription.  Spend hours saying what happens but not even minutes suggesting what should happen differently and better.


Become obsessed with the software’s treasures for creating, revising and polishing the format of the process maps rather than their content or effectiveness.  Some people become addicted to PowerPoint’s clever formats and tricks but fail to think about how to make the slides useful.  So too, the packages that create clever shapes and fancy arrows can steal you away from insights.


Ignore metrics so that elapsed time, numbers of events, counts of output, handoff totals, and the like are absent.  Without metrics, maps are much less useful.


Let them collect dust afterwards.  A feckless process map, one that doesn’t reflect real life and guide people in what to do, was not worth doing.  But a well-thought-out process map that disappears into a drawer brings no value either.

Criticisms of scenario planning as undertaken by law departments

Intriguing and intuitively attractive, scenario planning has a buzz about it.  This blog has several posts that refer to it, but as yet I have not challenged the technique (See my post of Aug. 25, 2009: uses of scenarios in legal departments with 18 references; March 1, 2010: elaborate scenarios at Microsoft; Aug. 15, 2011: scenarios as training tools; and Dec. 1, 2011: base rate neglect when we think about scenarios.).  Here are some of my doubts.


  1. Little empirical evidence exists that compares the vaunted method to other methods of strategic planning.  We always read about Royal Dutch Shell, but not much else.
  2. Participants may not be able to think creatively, out of the box.  They are much more likely to extrapolate from the present, but in a world full of unanticipated interactions, straight-line projections often go awry.
  3. Interactions are not considered.  It is one thing to assume a single change and trace out its implications; it is much more difficult, yet more realistic, to introduce other factors that would respond to that change and interact with it.
  4. The scenario planners probably don’t go out far enough.  Major trends take a long time to unfold, yet most people don’t have visibility several years out.
  5. To imagine the future is especially challenging for lawyers, many of whom are risk-averse, skeptical, competitive, and find it easy to attack someone else’s “far-fetched” idea.
  6. Some planning exercises, on their face neutral and objective, are in fact ideological, political, power plays aimed to reach a predetermined end.
  7. The leaders of the group, and the members, may make no effort to put on the table what they take for granted or will not address elephants in the room.  Unstated or unchallenged assumptions warp the accuracy of scenarios.
  8. Like all group efforts, scenario planning is more likely to succeed if a skilled facilitator guides the group, and even more so if the group members receive training.

Downsides of focus groups as undertaken by law departments

In my consulting projects, I am not a fan of focus groups.  Here are some of my concerns about them.


  1. They waste many people’s time.  Members sit for stretches of time and have little to contribute.  The moments of value to others sometimes represent a small portion of their blocks of time.
  2. Participants do not make candid, full comments.  People do not want to bring up sensitive topics or appear critical of others.  Subordinates lapse into silence.  Members may not say what they think nearly as openly as they would in a confidential, one-on-one interview or on an anonymous survey.
  3. They end up being too scripted.  Often junior consultants conduct focus groups.  They stick to the set of questions given them, come hell or high water.
  4. Groups need a skilled facilitator as well as guidelines and quite possibly training of the members.  A focus group that makes progress rarely blossoms on its own.
  5. They can be hard to keep on track, with a progressive tempo, if the sessions take place weeks apart.  Scheduling difficulties can extend the elapsed time and cause people to miss sessions.
  6. They can under-perform because whoever sets them up makes a poor choice of group members.
  7. They can flounder because they have an unclear purpose or detour off track.
  8. They can be afflicted with dysfunctional group dynamics.


For other posts from this blog about focus groups (See my post of April 6, 2009: brainwriting as a tool for focus groups; Jan. 7, 2010: tool to assess employee engagement; and Jan. 23, 2012: one of the important management concepts.).

Collective action by groups of legal departments – potential large but realization small

From time to time this blog mentions instances where several law departments have banded together to push an initiative. I refer to them as collective actions and praise them. Progress would be made on several management fronts if fellow-traveler departments more often combined their resources.

To be helpful, I wrote an article about collective action by law departments and what some of the foreseeable consequences of its spread might be. You are invited to read my collective action article, published in the National Law Journal, or to download it: 12-01-09 collective action by law depts NLJ

Seven revisions of the most common management tools of general counsel, based on my 2005 list of 18

Back in the mists of time, I wrote about the 18 tools that general counsel most commonly use (See my post of April 14, 2005: based loosely on Bain’s annual studies of tools.).

Seven years later, I would no longer include balanced scorecards, employee satisfaction surveys, psychometric tests and retainer letters. Recreating what my thought process was back then when I included them gets me nowhere. Today I recant: those four tools rarely show up in law departments.

As to three other tools, I am dubious, to put it mildly, about including client satisfaction surveys, mission statements, and strategic plans. Although they have their adherents, those tools hardly rise to the level of “common.”

Thus, from the original list, the common coin of general counsel in the management arena remains benchmarking, internal-expense budgets, matter-specific budgets for outside counsel, competitive bidding by firms, convergence, document management systems, electronic billing, intranet sites for law, matter management systems, off-site retreats, and personnel evaluations.

Today I would replace the dropped tools with internal discovery teams, litigation hold processes, matrix reporting, mentoring, portals, preferred law firms, and succession plans. Stay tuned for my update in seven years.

A design structure matrix (DSM) to plot how a complex project should proceed

A tool to help communicate about and plan for a complicated project is what some analysts call a design structure matrix (DSM). As described in the Harvard Bus. Rev., Oct. 2011 at 106, in a DSM, “a project’s tasks are listed along the rows and columns of a matrix, and the team marks whether each item is related to the others, designating each relationship as either a direct dependency or a feedback loop.”

Matrix algebra can then calculate a recommended order for the tasks, or a simpler spreadsheet function can show the interactions. DSM sounds like industrial-grade Gantt charting, but for hefty projects run by legal departments, a watered-down version might be helpful.
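
To make the ordering step concrete, here is a minimal sketch in Python; the task names and the function are hypothetical, and a simple topological sort stands in for the matrix algebra, with tasks caught in feedback loops returned separately rather than sequenced:

```python
from collections import deque

def dsm_order(tasks, depends_on):
    """Sequence tasks so each follows its direct dependencies.
    Tasks never reached sit in feedback loops (cycles)."""
    indegree = {t: len(depends_on.get(t, ())) for t in tasks}
    dependents = {t: [] for t in tasks}
    for task, deps in depends_on.items():
        for dep in deps:
            dependents[dep].append(task)
    ready = deque(t for t in tasks if indegree[t] == 0)
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for nxt in dependents[task]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    loops = [t for t in tasks if t not in order]
    return order, loops

# Hypothetical project: collect documents, then review, then report
tasks = ["collect", "review", "report"]
depends_on = {"review": ["collect"], "report": ["review"]}
print(dsm_order(tasks, depends_on))  # (['collect', 'review', 'report'], [])
```

Read the first list as the recommended sequence; anything in the second list depends on itself through other tasks and needs the feedback-loop treatment the article describes.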

I have written about matrices, which are sometimes simply tables or grids (See my post of March 10, 2005: Johns Manville’s RAPP matrix; May 28, 2005: a spending matrix; May 3, 2006: PetSmart’s complex matrix; Feb. 1, 2007: nine-box tool; March 25, 2009: grid analysis methods; Aug. 4, 2009: Laffey matrix; Aug. 9, 2010: law firms put in portfolio matrix; Sept. 27, 2010: patent lawyers team with colleagues to prepare an opportunity matrix; and Nov. 8, 2010: contract approval matrix.).