In the last edition of The CAD Manager’s Newsletter I mentioned having a great conversation with a couple of CAD managers and promised we’d cover one of the big topics we discussed. That topic is how to audit your CAD practices and optimize performance in the organization. The actual question asked was this:
Our management wants us to do an audit of our CAD use and practices and recommend how to make things better. So, how do we go about doing that?
In this edition of The CAD Manager’s Newsletter, I’ll share a synopsis of our discussion about this topic. Here goes.
In my experience, any CAD performance audit begins because there are problems (errors) that need to be addressed, so the logical place to start is identifying what those problems are. After all, if you're trying to get more efficient, doesn't it stand to reason that eliminating problems should be the goal? It also turns out that when everybody is focused on eliminating problems, there is a common purpose for doing the audit. The natural progression then becomes finding where errors happen and then identifying the root problem causing each error.
To start the process, here are some steps I've found always work when trying to root out errors:
Ask CAD users where the problems are. They’ll tell you. Of course they’ll tell you what the problems are from their production point of view, but you’ll have a good starting point to begin the audit.
Ask senior and project managers (PMs) where they think the problems are. They'll tell you something different from the CAD users, as they're focused almost entirely on deadlines and customer satisfaction.
Ask IT if they see any problems with CAD tool usage. They’ll give you an entirely different spin on the issue that’ll focus on security, costs, and licensing.
If you keep track of all the responses, you'll have a balanced assessment of your CAD problems from all different perspectives. In my experience, you can't solve CAD problems by focusing on any one perspective (user, PM, or IT); rather, you must consider all stakeholders.
Now that you have a list of problems from all stakeholders, it’s time to start correlating and looking for common threads. Here’s the hierarchy of filters I use along with a few examples:
Problems listed by everyone. If everybody is telling you that capturing PDF output for submittals is an issue, then it probably is. Of course, users may talk about standards configurations, PMs will cite missed deadlines, and IT may talk about configuration with Bluebeam, but the point is you have some consensus to work with.
Problems listed by users and PMs. If users and PMs are all telling you that sending data back and forth to clients is problematic, then it probably is. The problem may be with different software versions, intermediate formats like IFC, or simply data standards.
Problems listed only by PMs. If PMs tell you that meeting client timelines is a problem, but CAD users do NOT tell you, then the situation is usually that the CAD users simply don't know until it's too late, which is a basic communication problem.
Problems listed only by CAD users. If users tell you that sending information to clients is like pulling teeth because there’s no standard way to do so, but PMs say nothing about the issue, then users are typically solving the problem on their own.
Problems listed only by IT. Typically security issues related to cloud apps or license server issues that CAD users may not even know are causing them problems.
Obviously, this isn’t a comprehensive list of problems you’ll encounter, but in my experience it covers the majority. Do not go any further in your CAD auditing process until you’ve filtered and collated your findings.
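If you want to make the filtering step concrete, the collation logic above can be sketched in a few lines of code. This is purely an illustrative sketch, not anything from the audits described here; the problem names, stakeholder group labels, and survey results are all hypothetical stand-ins:

```python
# Hypothetical sketch: collate which stakeholder groups reported each
# problem, then bucket every problem according to the filter hierarchy
# described above (everyone / users+PMs / PMs only / users only / IT only).
from collections import defaultdict

# Hypothetical survey results: problem -> set of groups that reported it
reports = {
    "PDF submittal output":     {"users", "pms", "it"},
    "client data exchange":     {"users", "pms"},
    "missed client timelines":  {"pms"},
    "no standard send process": {"users"},
    "cloud app security":       {"it"},
}

def bucket(groups: set) -> str:
    """Map the set of reporting groups onto the filter hierarchy."""
    if groups == {"users", "pms", "it"}:
        return "listed by everyone"
    if groups == {"users", "pms"}:
        return "users and PMs"
    if groups == {"pms"}:
        return "PMs only"
    if groups == {"users"}:
        return "users only"
    if groups == {"it"}:
        return "IT only"
    return "other combination"

buckets = defaultdict(list)
for problem, groups in reports.items():
    buckets[bucket(groups)].append(problem)

for label, problems in sorted(buckets.items()):
    print(f"{label}: {', '.join(problems)}")
```

Even on paper rather than in code, the point is the same: the combination of who reported a problem tells you what kind of problem it probably is.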
Of course, to solve any problem you must offer a plausible solution, right? Sometimes solutions are simple (fix a standard, properly configure drivers, etc.), but sometimes solutions become needlessly biased toward software. After all, if we're having problems with our CAD tools, then we should probably just buy new software, right? No! Let me illustrate by sharing two examples of client audits I've been involved with, and you'll see what I mean.
Company A. Some years ago I started working with a client who'd recently botched their CAD audit. They had decided to drop the tools they had and pursue higher efficiency by purchasing the most advanced 3D modeling and machining software on the market. They spent a ton of money only to find that the new design and production processes they put in place were worse than the old system. They had to revert to their old software and admit an expensive defeat.
It turns out that the true problems they had were inefficient workflows, poor training, and an abject lack of data and filing standards. No matter what software they adopted, they would have the same problem as before because the software wasn’t the problem in the first place. Our subsequent audit brought these issues to light and we diligently worked to eliminate the problems using the software environment they already had.
Company B. I currently work with a company that manufactures extremely complex plastic and bent metal tube furniture using only a few seats of Inventor, mostly AutoCAD, and a variety of older CNC and tube bending software tools customized to their precise needs. They've raised CAD productivity by a factor of eight in the 20 years I've worked with them because they are in a constant state of internal audit driven by manufacturing errors.
The only thing this company cares about is producing quality products with the fastest delivery and lowest error rates possible. If buying new software supports those goals, then they purchase it and implement it properly. On the other hand, if using the tools they already have, more intelligently customized, is the answer, then that's what they'll do. Note that they aren't married to any CAD tools; they just use what works.
My point is that any solution you offer must be based strictly on what it takes to fix the problem. At the end of the day, you'll need to defend your solution, so it should be factual, practical, and results-oriented.
Based on what you’ve done so far you probably have a gut feel for where the problems are and how you’d propose to solve them. It is now time to pull everything together to create a preliminary report which includes the following elements:
Problems found. As listed above.
Problem severity. Based on which problems cause the most disruption in production. This is not what annoys users or PMs the most — it is what is causing work to be delayed or in error.
Problem stakeholders. Defines what groups must participate in the solution — users, PMs, IT, or a combination of all of them.
Proposed solutions. This is simply your conceptual framework for how these problems might be solved considering what users, PMs, senior management, and IT must all do to solve the problem together.
Proposed costs. What will your solutions require in terms of labor, training, IT time, software utilities, and so on? Management will want to know this, so it behooves you to have a good idea of the numbers now.
Ask for next steps. Since everyone will need to be involved in solving the problems you've found, why not ask everyone for their feedback? You'll need their involvement later, so get them on the record now to ensure their engagement.
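One way to keep the severity ranking honest is to order the report by measurable production impact rather than by who complains loudest. Here's a minimal sketch of that idea; the findings, delay figures, stakeholder lists, and proposed fixes are all hypothetical examples, not data from any real audit:

```python
# Hypothetical sketch: assemble preliminary report entries and rank them
# by production disruption (work delayed), per the severity guidance above.
findings = [
    # (problem, workdays delayed per month, stakeholders, proposed fix)
    ("PDF submittal output", 6, ["users", "PMs", "IT"],
     "standardize PDF/Bluebeam configuration"),
    ("client data exchange", 4, ["users", "PMs"],
     "agree on exchange formats and versions"),
    ("cloud app security", 1, ["IT"],
     "review approved cloud app list"),
]

# Sort by disruption to production, highest impact first.
report = sorted(findings, key=lambda f: f[1], reverse=True)

for problem, delay, stakeholders, fix in report:
    print(f"{problem}: ~{delay} days/month delayed; "
          f"stakeholders: {', '.join(stakeholders)}; proposed: {fix}")
```

Whether you do this in a spreadsheet or a script, the design choice is the same: severity is defined by delayed or erroneous work, so that's the column you sort on.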
Now, you wait for the feedback to roll in and follow up with extensive communication. Since you’ve identified the problems by talking to all stakeholders, you should now be able to start making progress. Do this as long as meaningful feedback is coming in.
Once the wait-and-communicate phase comes to its logical conclusion, issue your final report to management and wait for approval or cross-examination.
The process of auditing your CAD processes requires effort from everyone and a unified push for solving problems. As CAD manager, you’re in a unique position to collect the data and correlate it to obtain the best possible results after the audit is done. Just use the basic process I’ve outlined above and you can’t go wrong. Until next time.