Corporate development services for investors and business owners
As a consultant in organizational development, finance and administration, I identify issues for company owners and investors; develop and implement plans to reorganize and stabilize the business; reduce overhead costs; and implement new infrastructure, such as accounting, project management and IT systems.
My job is to make my client successful.
My role is to facilitate decisions while remaining in the background: I guide decision makers and leaders toward the right decisions, and I do my best to prevent them from making bad ones.
Services I have provided in the past include:
- Due diligence pre- and post-acquisition/merger
- Post-acquisition performance evaluations and assessments
- Integration of organizations, systems and processes, including
- Governance systems
- Project management and operational systems
- Financial control and procurement
Contact me for a conversation: firstname.lastname@example.org
Is there a common set of tools that facilitates analysis of absolutely anything and everything? Or is this notion as ridiculous as an 87-in-1 Swiss Army knife? Here are a few useful guidelines:
General Principles: Procedural
Know what others have done before you.
- Chances are that someone has thought about – and written about – the same problem you are analyzing. You needn’t reinvent the wheel. Do your research before you attack a problem.
Know your audience.
- Remember that it only makes sense to communicate the results of your analysis in terms that your readers will understand.
Be open to making mistakes. Record your mistakes.
- It is said that Edison’s light bulb was the result of over 3,000 failures. But he never made the same mistake twice.
Pay attention to your skepticism.
- If something doesn’t seem right, there may be a valuable insight in your skepticism. Investigate your doubts.
Take your time. Get into it. Go away from it. Come back to it.
- This is my personal approach. It may not work for everyone. I like to first immerse myself in the data and have the time to work with it. If I have to rush it at the last moment, well, there are going to be errors.
General Principles: Methodological
Take a Systems approach.
- Thinking about a problem as an element of a larger, more encompassing system reveals underlying insights – and helps ensure that your analytical work isn’t contributing to the very problems you and your team are attempting to resolve.
Start with the idea that there must be a good reason for why things are the way they are.
- A “good” reason is only a diagnosis.
You will only find that which you measure.
- Question the data. If you are measuring the wrong thing, you ultimately won’t find what you are looking for.
Start with the raw data. Not reports, which are someone else’s interpretation of the data.
- If you are working from previously generated reports and tables of data, you are working from someone else’s assumptions, conclusions – and errors.
Either the answers will emerge from the data, or you need to be testing the data for answers you hypothesize.
- This is the difference between finding hypotheses and testing for them. Be certain you know which is which.
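One mechanical way to stay certain which is which is a hold-out split: explore one half of the data freely to find hypotheses, then test them only against the half that never informed them. A minimal sketch, with synthetic data invented purely for illustration:

```python
import random

# Synthetic data standing in for whatever you are analyzing
# (all figures here are invented for illustration).
random.seed(7)
data = [random.gauss(0, 1) for _ in range(200)]

# Split once, up front, before looking at anything.
explore, confirm = data[:100], data[100:]

# "Finding": browse the exploratory half freely for patterns,
# e.g. "this process seems to produce extreme values".
hypothesis_in_explore = max(explore) > 2

# "Testing": check the hypothesis only against data it has never
# seen, so the test is not contaminated by the search that found it.
hypothesis_in_confirm = max(confirm) > 2

print(hypothesis_in_explore, hypothesis_in_confirm)
```

The discipline is in the split itself: a pattern "found" and "confirmed" on the same data has only been found once.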
Plan for serendipity.
- The unexpected happens. Pay attention and expect it.
- We want to make distinctions. But sometimes it makes sense to accept two propositions that appear contradictory.
Turn the problem on its head. Then, look at it from the side.
- A chicken is an egg’s way of making another egg, as Samuel Butler put it.
- Leaders tend to influence the framing of a problem, as well as its analysis and interpretation. If you disagree, say so.
Beware the Arbitrary.
- Keep in mind that when you name something you’re automatically assigning it a higher degree of relevance.
Don’t read into the statistics.
- We all have a tendency to want to find cause and effect. And when we can’t see it immediately, we want to believe that the more probable an outcome or relationship, the more likely it reflects a causal relationship. As any statistician will tell you, it ain’t necessarily so.
Know when to stop.
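The caution above about reading causation into statistics can be made concrete. In this sketch (entirely synthetic data of my own construction), two measures correlate strongly only because a hidden third factor drives both:

```python
import random

random.seed(0)

# A hidden confounder z drives both x and y; neither causes the other.
z = [random.gauss(0, 1) for _ in range(500)]
x = [zi + random.gauss(0, 0.3) for zi in z]
y = [zi + random.gauss(0, 0.3) for zi in z]

def corr(a, b):
    """Pearson correlation coefficient."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / n
    va = sum((ai - ma) ** 2 for ai in a) / n
    vb = sum((bi - mb) ** 2 for bi in b) / n
    return cov / (va * vb) ** 0.5

print(round(corr(x, y), 2))  # near 0.9 – strong, yet not causal
```

A strong correlation here is exactly what the statistics predict, and it says nothing about x causing y.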
(c) 2012 Ira A. Greenberg
In those situations when the phrase “analysis paralysis” rolls so easily off a manager’s tongue, the fact that the words rhyme might seem to reveal some particularly true truth: Yes, of course, analyzing things makes them come to a complete standstill. Therefore, it is logical to conclude that regardless of the issues at hand, it is better not to analyze. It is less risky to just keep things moving. However, if you must analyze, do only as much as necessary to meet some requirement.
To an analyst such as myself, it is rather unfortunate that the words rhyme. Hearing someone joke about “analysis paralysis” reveals an organizational anxiety: a fear of losing control, a systemic anxiety that has the potential to stand in the way of progress, positive change and growth.
Analysis and paralysis share a common root, by way of New Latin, in the Greek lyein, to loosen or break up. Analysis “loosens” hidden secrets; “para” means the side (as in parallel). So paralysis is where the side loosens to the point where it ceases to move.
When we analyze data or systems, we certainly do want the patterns to reveal themselves – to come loose, to emerge despite the forces holding fast the breakthrough realizations and insights needed to facilitate progress, positive change and growth.
Certainly, no organization can afford for things to stop moving. But management research has shown that the organizations which embrace self-analysis are the most successful ones. The lesson: you need not break it in order to fix it.
It is not analysis in and of itself that causes paralysis; it is the anxiety in the system that associates analysis with paralysis which, in fact, precipitates the feared paralysis.
How to avoid analysis paralysis? Start with a plan to create an analytical organizational culture. Train managers in analytical techniques. Give them the tools they need to challenge the way things are done today, in order to loosen the hidden insights and reveal the way to continued success in the future.
(c) Ira Greenberg 2010
“Son, you can rebuild this carburetor from here to eternity and you won’t fix what’s wrong with this car.”
In 1982, while in grad school at the Annenberg School for Communication at Penn, I bought a 1968 Volvo for $800.00. Barry and I drove it up to Canada and back for a conference, and the car was never the same – it just didn’t run right. That summer, I decided I was going to fix the car myself and learn something about how cars worked. I bought the official manual. I got the tools. The place to start was to clean the carburetor, so I did that. But the car still didn’t seem to run very well. I took the carburetor apart, rebuilt it with new kit parts, put it back together again, and installed it – again, the car didn’t work quite right. After several iterations of this rebuilding task, I finally took the car in to a Volvo dealership, where an older gentleman in the shop who knew these old models took a look at it. I explained that I had rebuilt the carburetor several times and asked whether perhaps he could fix the carburetor. I stood by while he inspected the car. After a few moments, he looked up at me, rubbed his hands on a rag, and said, “Son, you can rebuild this carburetor from here to eternity and you won’t fix what’s wrong with this car.”
The mechanic was my consultant. I was the client who had diagnosed the problem and implemented a program. And when doing it myself didn’t work, I finally went to a consultant to implement it. Great program – wrong problem.
Analysis and evaluation aren’t only appropriate for the end of a project or program. Had my expert been involved from the beginning, I would have saved considerable time and trouble.
“What should I measure, and why?” are the most frequent questions I’m asked by training Managers, VPs and Directors. What I hear them asking is – what value is there in collecting data beyond the traditional “smile sheets” completed by students after each training session? Were they to invest in the analytical services I offer, how would their training program benefit? Would their sales people be more productive? They are challenging me to prove it.
First – the Why?
Organizations which evaluate the effectiveness of their learning & development programs tend to be the organizations with the more effective learning & development programs.
This isn’t to suggest that measurement, assessment and evaluation automatically improve training programs. Rather, it is a statement about “Learning Organizations.” According to Peter Senge (1990: 3), who coined the phrase, learning organizations are: “…organizations where people continually expand their capacity to create the results they truly desire, where new and expansive patterns of thinking are nurtured, where collective aspiration is set free, and where people are continually learning to see the whole together.” Here is the value of measurement, assessment and evaluation: Becoming an organization where people continually expand their capacity to create the results they truly desire. Measurement and analysis provide the feedback necessary for intentional systematic improvement.
As a career analyst with a graduate degree in social science research and 25 years of experience as a business analyst and consultant in finance and organizational development, my natural tendency is to want to measure everything and analyze all the data I can get my hands on. Numbers tell stories, and – based on the stories the numbers tell – I provide my clients with insights into training and performance improvement that go beyond hunches and gut feel. But with Learning & Development budgets strained as they are, managers want to know why they should spend more if they don’t have to. I’m not interested in analysis for the sake of analysis, but I am intensely interested in analysis for the purpose of performance improvement: improving the quality and impact of training, justifying the cost of training, and reducing the cost of training. Analysis of data yields information and insights into performance improvement.
What to measure? Not everything requires or warrants every evaluation. That is, only some high-value courses/programs would benefit from a full “Level 5” ROI analysis.
Targeted, focused ROI studies, especially for sales training programs
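For reference, the arithmetic behind a Phillips-style “Level 5” ROI study is simple once program benefits have been isolated and converted to money; the figures in this sketch are invented for illustration only:

```python
def training_roi(monetary_benefits: float, program_costs: float) -> float:
    """Phillips 'Level 5' ROI: net benefits as a percentage of costs."""
    return (monetary_benefits - program_costs) / program_costs * 100

def benefit_cost_ratio(monetary_benefits: float, program_costs: float) -> float:
    """Companion benefit-cost ratio (BCR)."""
    return monetary_benefits / program_costs

# Hypothetical figures: a sales training program costing $50,000
# with $120,000 in isolated, monetized benefits.
print(training_roi(120_000, 50_000))        # 140.0 -> a 140% return
print(benefit_cost_ratio(120_000, 50_000))  # 2.4
```

The hard work of such a study is not this division; it is isolating the training’s effect from everything else and defending the conversion of benefits to dollars.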
Training Spending and Staffing Statistics
- Training expenditures per learner
- Training staff to learner ratio
- Percent of spending on L&D staff payroll
- Percent of spending on learning technologies
- Spending allocation by training program area
- Spending allocation by employee type
- Training volume and delivery statistics
- Annual student hours consumed per learner
- Cost per student hour consumed
- Percent of student hours by delivery method
- Number of employees trained, by position title
- Number of courses offered, by content area
- Number of sessions offered, by course, by content area, by delivery method
- Number of registrations compared to number of attendances
- Content development costs
- Time to employee readiness or competence
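Most of the statistics above reduce to straightforward ratios once the underlying counts are collected. A minimal sketch, using entirely hypothetical figures for a single training operation:

```python
# All inputs below are assumptions, invented for illustration.
total_training_spend = 1_200_000   # annual L&D expenditure, USD
ld_payroll = 480_000               # spend on L&D staff payroll
learners = 1_000                   # employees served
ld_staff = 8                       # L&D staff headcount
student_hours = 24_000             # student hours consumed

spend_per_learner = total_training_spend / learners           # 1200.0
cost_per_student_hour = total_training_spend / student_hours  # 50.0
learners_per_staff = learners / ld_staff                      # 125.0
payroll_share = ld_payroll / total_training_spend * 100       # 40.0 (%)
hours_per_learner = student_hours / learners                  # 24.0
```

The analytical value comes not from any one ratio but from tracking them over time and comparing them against benchmarks.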
Analysis of TEACHING
- Quality and effectiveness of:
- Instructional design
Analysis of LEARNING RESULTS
- Pre-learning diagnostic assessments
- Post-learning knowledge, concepts
- Post-learning application/behavior change
Analysis of DELIVERY
- Course session location, time
- Method: Classroom vs. Online vs. Self-study vs. Combination
Analysis of TESTING
- Are your exams valid & reliable?
- Are your exams measuring what you think you’re measuring?
- Is your exam fair?
- Will the results of your tests predict future success?
- Do the questions properly test their associated learning objectives?
- Will the assessment correctly indicate that a learner has mastered the material?
- Do your questions cover knowledge and comprehension and application?
What to measure, and why? Measurement, analysis and evaluation in and of themselves are not the value – It is how the insights derived from analysis are used for learning planning and improvement.
(Note on this illustration: This word cloud is a visual content analysis of ASTD’s 2009 report, which I generated through wordle.com.)
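The frequency counts underneath a word cloud like this one are easy to reproduce. A minimal sketch using only the Python standard library (the sample sentence is invented; a real analysis would use the report’s full text and a longer stopword list):

```python
from collections import Counter
import re

def word_frequencies(text, stopwords=frozenset()):
    """Count word frequencies -- the raw input a word cloud visualizes."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(w for w in words if w not in stopwords)

sample = "learning drives performance and learning drives growth"
print(word_frequencies(sample, {"and"}).most_common(2))
# [('learning', 2), ('drives', 2)]
```

The cloud is just a visualization of these counts, with font size proportional to frequency.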
I recently reviewed the two key annual studies of trends in corporate Learning & Performance: Bersin & Associates Corporate Learning Factbook 2010, and ASTD’s Annual Review of Trends in Workplace Learning & Performance, the 2009 State of the Industry Report. I found that these studies contradict one another in some fairly significant ways, e.g., Bersin said that companies’ L&D budgets fell precipitously in 2008 and 2009 as a result of “the faltering US economy,” while ASTD said that, “investment in workplace learning and performance was stable in 2008 …”
As in any study, the analysis can only be as good as the sample data available, and conclusions are always an outcome of what you measure. Without going into an in-depth critique of the two organizations’ methodologies, suffice it to say that they measured different things in differing ways. Bersin’s study is limited to US trends and ASTD’s data is global. Bersin is using data from 2009 and ASTD’s report is based on 2008 data. Also, ASTD’s data includes corporate expenditures for tuition reimbursement. These limitations being clear, I wanted to know: What may we as consultants learn from these two sources about significant current general trends in L&P that will help us to serve our clients and meet their needs?
If it is true that there has been a downward trend in spending on training operations by small and large companies – at least in the US, how could that be good news for L&P consultants?
According to Bersin & Associates, training budgets fell by a combined 21% from 2007 levels by 2009. According to ASTD, from 2007 to 2008 (the latest data available), “The average annual learning expenditure per employee fell from $1,110 in 2007 to $1,068 in 2008, a decrease of 3.8%.” Bersin’s survey puts the 2008 median training expenditure at $714 per learner.
These reductions were no doubt in part the result of reduced employee headcount during the general economic downturn of the past two years: Fewer employees means less training. In 2009, L&D staff also experienced significant reductions – Bersin calculates a 4% reduction for small businesses and 8% for large companies.
And to top it all off, according to ASTD, “Since 2004, organizations have relied less on outsourcing each year. The average percentage of the learning budget allocated to external services was 22.0% in 2008, down from 25.2% the previous year.” Smaller budgets, and a smaller percentage going to outside L&P contractors, would suggest a smaller market opportunity for L&P consultants.
So what’s the good news?
Unlike economic downturns in the past, ASTD survey respondents agreed with the statement that, “We placed/are placing a stronger emphasis on learning than the last downturn.” Agreement was 37.9% vs. 24.5% in past surveys. Outsourcing is a strong trend, and it seems that the greatest opportunity is for consultants who offer added value …
Learning & Performance professionals have learned to become more productive with fewer resources.
The number of hours of formal learning content available per L&P staff member rose to an average of 353 hours in 2008, continuing an upward trend from prior years (ASTD). At the same time, the consolidated average cost per learning hour available decreased 8%, from $1,660 in 2007 to $1,528 in 2008 (ASTD). In addition to the increased availability of content, in 2008 L&P facilitated a slightly greater average number of used hours per staff member – from 5,497 hours in 2007 to 5,507 in 2008.
ASTD reports a reduction in the average annual learning expenditure per employee; from $1,110 in 2007 to $1,068 in 2008, a decrease of 3.8%.
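Both decreases reported above are simple percentage changes, and the published figures check out. A quick verification (the helper function is mine; the inputs are the ASTD figures quoted above):

```python
def pct_change(old: float, new: float) -> float:
    """Percentage change from old to new (negative = decrease)."""
    return (new - old) / old * 100

# Expenditure per employee: $1,110 (2007) -> $1,068 (2008)
print(round(pct_change(1110, 1068), 1))   # -3.8
# Cost per learning hour available: $1,660 (2007) -> $1,528 (2008)
print(round(pct_change(1660, 1528), 1))   # -8.0
```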
The trend: L&P organizations are delivering just as much, but with fewer resources. Both ASTD and Bersin & Associates predict that this productivity trend will continue through 2009 and beyond.
How have Learning & Performance professionals achieved this greater level of productivity?
Some factors are obvious, such as an increase in the average number of employees per learning staff member: 253 in 2008, up from 227 the previous year (ASTD) – L&P associates and organizations are working harder.
Also fairly obvious is that online delivery of learning content – including remote instructor-facilitated online courses, self-study e-learning, and blended methodologies – ultimately reduced associated delivery and administrative costs such as staff time and travel & lodging expense. (However, I note that ASTD reported a rebound in classroom-based instructor-led learning in 2008 – possibly a result of companies slowing capital investment in technology due to the down economy.)
Other significant factors yielding greater L&P productivity – factors which are more difficult to quantify – included:
- Improved matching of the right training to the right employees, yielding greater impact and less waste, or “scrap” training;
- A sharpened focus on training employees in their specialized competencies, as well as an emphasis on product related training content in order to yield greater sales productivity;
- Enhanced repeatability of training programs, less time required by L&P professionals to prepare courses, train-the-trainer initiatives, and the reuse of materials through in-house and third-party online Learning Management Systems;
- Continued centralization of training operations through corporate services, providing greater learning opportunities across multiple lines of business;
- Better alignment of training initiatives to corporate strategic priorities;
- Continuous improvement of training courses, materials and instructor/facilitators;
And, perhaps most importantly:
- According to Bersin, the “use of external instructors and facilitators remained the largest area of outsourcing, followed by use of external content developers, which was up significantly in 2009.” ASTD’s study directly contradicts this finding: “Since 2004, organizations have relied less on outsourcing (spending on external services such as consultants, workshops, and training sessions from outside providers) each year. The average percentage of the learning budget allocated to external services was 22.0% in 2008, down from 25.2% the previous year.” Perhaps the difference reflects Bersin’s US-based data versus ASTD’s international data.
What are the trends for the near future?
- Chief Learning Officers are increasingly recognizing the value of testing and evaluation in the continuous improvement of Learning & Performance programs.
- Learning & Performance professionals have acknowledged the real importance of informal learning networks – most corporate learning takes place outside of formal classroom settings and online courses. Recognizing the importance of coaching, mentoring and online social networking, L&P planners are looking for creative and effective ways to use new technologies in their learning programs.
So what does it all mean for L&P consultants? Both studies go into much greater detail than this brief article covers. But what is clear from this information is that, as consultants, we need to continue adding value beyond content development in ways that enhance our clients’ productivity and efficiency, and that help them creatively align learning initiatives with their company’s strategic goals and objectives.
Learning & Development
Testing, Assessment & Evaluation Consulting
- Kirkpatrick/Phillips: Level I-IV test results analysis and evaluations
- ROI: Return on training investment analysis
- Training Program Annual Review
- Reports & Dashboards
- Online Systems Implementation
- Pedagogue Solutions
- Knowledge Advisors; Metrics that Matter