Wednesday, October 16, 2019

Creating a Successful Data and Analytics Program - Select the organizational structure (Part 3 of 6)

This is part 3 in a 6-part series about creating a successful Data and Analytics program. To read parts 1 or 2, please follow the links below.
And now on to part 3...

Step 2:  Select the organizational structure

It is obvious from the outset that a business must create some type of organizational structure around a Data and Analytics initiative as it requires the full-time effort of at least a small group of colleagues. There are four choices when selecting which organizational structure is right for your company, and all of them can lead to successful outcomes.
  1. Business unit led - When business units have distinct data sets and scale isn’t an issue, each business unit can make its own Data and Analytics decisions with limited coordination. This approach is very flexible but will typically lead to independent silos of data. This may or may not be an issue depending upon the structure of the business units within the company and their need or desire to share data. It is possible that data independence is a key driver to success. This approach can also be more expensive if multiple teams are building their own programs. That said, the cost can be minimized by at least having a central technical platform for everybody to share. Thus, it would just be the "business" portion of the program that would differ across units.
  2. Business unit led with central support - Business units make their own decisions but collaborate on selected initiatives. This is a slight modification to the first approach where the business units occasionally come together on specific use cases. This collaboration is a good way to standardize certain data. Let us say, for example, that multiple business units accept customer complaints and record them. There is benefit to standardizing the codes for the set of possible complaints as well as the business processes for accepting and recording said complaints so that the metrics can be compared and contrasted across the business units. Because there is no full-time, central team identifying opportunities for and governing these collaborations, corporate leadership as well as the leaders of the various business units need to be committed to this collaboration. It is easy to miss an opportunity or for business units to diverge again over time.
  3. Center of Excellence - An independent center oversees the company’s Data and Analytics while business units pursue initiatives under the CoE’s guidance and coordination. This is the middle-of-the-road approach where a smaller, central team takes on the responsibility of identifying opportunities for collaboration and governing the use cases, but the business units are also allowed a significant degree of freedom through self-service capabilities. The CoE must truly be empowered by corporate leadership to serve in this governance capacity, and it also needs the right approach and mindset to build a grass-roots movement of collaboration and self-service in addition to the top-down mandate.
  4. Fully centralized - The corporate center takes direct responsibility for identifying, prioritizing, and implementing all initiatives. In this case, business units are still involved but primarily to consult as subject matter experts on the data they use and their business processes. This approach allows the company to impose the highest level of governance, but it can be difficult and expensive to build a single team large enough to satisfy all of the data needs without becoming a bottleneck. Caution must also be taken to ensure that pursuing a "single source of truth" does not overshadow the actual, real-world needs of the business units. The program should always allow for some degree of governed flexibility.
The Center of Excellence is commonly regarded as the most flexible and advantageous approach with the fewest limitations. It is also, statistically, the most successful approach when implementing a new platform from scratch: it provides a small base of knowledge and skills while allowing the business units the flexibility to execute according to their specific needs as skill sets are acquired and new tools and techniques are adopted. This doesn't mean that the CoE is right for every company, but I tend to start there and ask, "why not?" when making the choice. One primary consideration is that, unlike the fully centralized approach, the CoE approach requires business units to self-serve. It is important to consider whether your company can commit to placing or training the right colleagues to serve these roles within the different business units.

In part 4 of the series, we will perform a proof-of-concept...

Tuesday, October 15, 2019

Creating a Successful Data and Analytics Program - Spell out your ambitions (Part 2 of 6)

This is part 2 in a 6-part series about creating a successful Data and Analytics program. To read part 1, please follow the link below.
And now on to part 2...

Step 1:  Spell out your ambitions

What will Data and Analytics provide that we cannot already do? What do we expect the ROI to be in terms of cost/time savings or additional revenue? Where do we stand compared to our rivals? Will Data and Analytics provide us a strategic advantage in the marketplace, or is it a requirement just to stay relevant? These questions and more are critical to consider as you embark on the Data and Analytics journey, and they need to be answered to an extent that gains significant buy-in from senior leaders. Stepping back a bit, you are, at the very least, looking to understand why you are going on this journey, what success looks like, and how you will measure your progress towards success. This applies to all significant investments a company makes, but becoming data-driven is typically such a difficult (and often expensive) undertaking that having this framework in place is even more vital.

Note:  Upper management support is not simply about money and prioritization. They, too, need to adopt Data and Analytics as part of their culture. If they expect a Manager or Director to run their operations according to specific KPIs, then the senior leadership needs to pay attention to those same KPIs and use them to review performance. If they expect others to accept and react to information gleaned from data, they have to be open to that as well even if the information runs contrary to long-held beliefs. If they want data literacy to expand beyond the Data and Analytics team, then they need to become data literate themselves and ensure that all teams are playing their role in the cultural evolution. HR, in particular, can be key to this evolution. At any medium to large company, it is not possible for a single team to achieve all of the change that is required. Significant support is necessary from leadership in all areas.

Along with answering questions like those posed above, a decision must be made on priorities. There are essentially four areas where Data and Analytics can be utilized:
  • Improve existing products and service offerings
  • Build new products and service offerings
  • Automate and optimize internal processes
  • Transform business models

Done correctly, a business can scale out their Data and Analytics initiative to cover all four bullets, but it is important to start with a narrower focus to avoid becoming overwhelmed and diluting the benefits of the initial undertaking. The fourth bullet is also, typically, a much more significant endeavor than the other three and likely not where you would want to start unless it is a necessity.

In part 3 of the series, we will select our organizational structure...

Creating a Successful Data and Analytics Program - Introduction (Part 1 of 6)

This blog post is the first in a 6-part series that outlines a framework for implementing a Data and Analytics Program. The intention is not to delve into the specific details of execution because there are many options that can work but to spell out the foundation and key principles upon which a successful program is built. The information presented is an amalgam of details culled from multiple years of research, experience, and conversations with others who have undertaken such an endeavor. The guidelines and advice are based upon industry standards and best practices along with lessons learned and the latest emerging technologies and techniques.

Introduction

The first thing to recognize and accept is that Data and Analytics is not an IT or even a technology initiative. It is a business initiative that requires technical know-how and support. The concept of Data and Analytics needs to be deeply embedded in the organization at all levels so that information and insights are actively sought, shared, and acted upon. Focusing solely on the technologies involved and centralizing the initiative within IT without significant collaboration are two very common reasons given for why Data and Analytics programs fail. That said, this 6-part series is not going to be yet another "Top 10 Reasons Why Things Fail" story. Instead, we will look at how a company can implement a successful Data and Analytics program and the roles that various colleagues play.

In part 2 of the series, we will start by spelling out our ambitions...

Back to Blogging

After working for the last several years either in a start-up with 100-hour work weeks or for a couple of companies with significant restrictions on their employees' public image and information sharing, I am once again in a position to get back to blogging. Though I know this blog isn't particularly popular, it is a great creative outlet for me and allows me to really align my thoughts. If something I write can also help at least one other person in their career or with a personal technical challenge, then I am very happy for it.

Thursday, February 18, 2010

On The Importance of Code Reviews

From my "I wrote this quite a while ago and never got around to posting it" collection...

I recently reviewed some code that contained a number of shared, global variables. Basically, a shared variable is re-used by all callers to the particular program in which it is defined. If one caller sets the value to “foo”, every other caller to that program will see the value “foo”. You could think of it as a chunk of shared memory that can be accessed by any instance of the program in which the variable is defined. A real-world analogy might be an office bulletin board where every worker has access to read and post important news. That works fine if only one or two people are involved in changing the items posted to the bulletin board at any one time, but what if 100, 1000, or 10,000 people all tried to update the posted news items at the same time?

The shared variable feature of VB.NET can be used to great advantage because it allows us to perform long-running tasks only once and share the results across every caller. Thus, all but the very first execution will run much faster than they would if the long-running task had to be performed every time. That said, we also need to be very careful when implementing this technique (search for SyncLock for more information).
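The pattern is not specific to VB.NET, so here is a rough sketch of the same idea in Python (the function names and the placeholder computation are mine, purely for illustration): a module-level variable caches the result of a long-running task, and a lock plays roughly the role that SyncLock plays in VB.NET.

```python
import threading

_lock = threading.Lock()
_cached_result = None  # shared across every caller, like a VB.NET Shared variable


def expensive_computation():
    # Stands in for the long-running task; we only want to pay this cost once.
    return sum(range(1_000_000))


def get_result():
    """Return the shared result, computing it only on the first call."""
    global _cached_result
    if _cached_result is None:          # fast path once the cache is populated
        with _lock:                     # roughly what SyncLock provides
            if _cached_result is None:  # re-check inside the lock (another
                                        # thread may have filled it meanwhile)
                _cached_result = expensive_computation()
    return _cached_result
```

The double-check inside the lock is the important part: without it, two callers racing past the first `if` would both run the expensive task, and with no lock at all they could interleave writes to the shared state, which is exactly the class of bug described below.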

In the code I reviewed, every execution of the program was changing the values of the various shared, global variables, and this introduces a major issue. The problem is that every caller can overwrite the data of every other caller at points in time when doing so causes errors or returns erroneous information. Due to the way applications are typically tested prior to reaching a production environment (very few users performing any given task), it is unlikely that such an issue would be found before it impacts end users. That is, unless you hold a code review attended by people who can find such a flaw and help come up with a solution. If the code in question had not been reviewed, I can say without a trace of exaggeration that the result would have been weeks to months of customer complaints about strange, intermittent errors. Numerous hours and dollars could have been wasted trying to track down and fix the problem. Instead, the problem was resolved before it ever made it outside the development environment. This is just one reason why code reviews, when done properly and involving the appropriate participants, are an invaluable part of the development process. Every project should allow time in the schedule both to hold code reviews and to react to the comments from them.

It is important to note that a code review does not always have to be what one might typically envision (multiple people locked in a room for 60-90 minutes). The idea is just that every piece of code we write has been seen by at least one more set of knowledgeable eyes prior to being thrust upon the users. The number of reviewers should scale up for code that is more critical, complicated, or involves the security of a user's account and/or private information.

Monday, August 10, 2009

How to Get Your iPhone Photos to Sort Chronologically

Just a heads-up: I am using iTunes running on Windows Vista to sync photos onto my iPhone 3GS, but most of these notes will still apply to other setups.

I don't know about the rest of you, but I prefer all my photos to be sorted in chronological order. This is particularly true when it comes to my iPhone. I don't usually sit and watch a slide show or even flip through multiple pictures. What I often do, though, is think of a picture I want to show somebody and scan through my photo library (several hundred photos) until I find it. Having every picture sorted by the date it was taken makes this task much easier. I won't go into the intricacies of how iTunes decides to order your photos when putting them on the phone. I've done hours of experimenting, and the easiest thing to say is that it tends to sort of try to keep them in whatever order you sorted the folder the last time you looked at it in your Windows folder browser. I'm being intentionally vague here because that seems to be how Apple decided to implement this feature. Whatever it really does (random number generation, contacting Steve Jobs behind the scenes for input, etc.) makes no sense at all, but it must be some strange interaction between iTunes and Vista.

Anyways, the solution I use is the following. Since I organize my photos with Google Picasa, I simply add all the pictures I want to put on my iPhone to an album for that purpose. Then, I select and export them to a single folder. During this process, I opt to shrink the photos down to a smaller size to conserve space. Regardless, these steps aren't that important. You just need to get all the photos you want on your phone into a single folder. Then, I rename the files with the date and time they were taken so that, when sorted by name (Windows default), they are in chronological order.

Of course, I don't sit there for hours renaming. There wouldn't be much point in blogging about that. I found a free program called Siren that can do it for me. Siren has many, many features and can be a little confusing to use at times, but I'll give you a simple command-line that you can execute. Let's assume that your photos are all .jpg and reside in a folder called "C:\Temp\iPhone Photos". Note that the folder should contain a COPY of your photos and NOT THE ORIGINALS just in case something goes wrong. You can run the following from the command line:

"C:\Program Files\Siren\Siren.exe" /D "C:\Temp\iPhone Photos" /N /E "%%Xdo.jpg" /S "*.jpg" /R /Q

It looks complicated, but all it does is execute Siren on the contents of the folder with a few key options enabled. Each .jpg file within will be renamed with the date and time (24-hour) the photo was taken. For example, a photo taken August 2nd, 2009 at 8:41:46 AM would become "20090802_084146.jpg". Note that the date is in kind of a funky order (year, month, day) depending upon what you're used to. This is just so that Windows and iTunes/iPhone will sort it properly. That's all there is to it. You now have a folder of photos named in chronological order just waiting to be synchronized to your iPhone. You can add more photos later and run the program again.
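If you'd rather not decode Siren's options, the naming scheme itself is trivial to reproduce. Here is a small Python sketch of the mapping (the function name is mine; it only builds the filename, it doesn't read the EXIF date for you):

```python
from datetime import datetime


def chronological_name(taken_at: datetime, ext: str = "jpg") -> str:
    """Build a YYYYMMDD_HHMMSS filename that sorts chronologically by name."""
    return taken_at.strftime("%Y%m%d_%H%M%S") + "." + ext


# The example from above: a photo taken August 2nd, 2009 at 8:41:46 AM
print(chronological_name(datetime(2009, 8, 2, 8, 41, 46)))  # 20090802_084146.jpg
```

Because the fields run from most significant (year) to least significant (second), a plain alphabetical sort of the filenames is automatically a chronological sort, which is the whole trick.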

PS - On the off chance that you somehow have two photos taken at the exact same second, I would just use a free program like Exifer to change the date/time metadata of one of the photos to be a second earlier or later.

Friday, January 23, 2009

Woodworking

Two posts on the same day! I just wanted to share some of my woodworking project photos available on Flickr (http://www.flickr.com/photos/cmaterick/sets/72157603138071980/). For whatever reasons, I just never adjusted the security settings to open these to the public as all my photos are limited to friends and family viewing. I also have a couple of projects in the works. So, check back in if you're interested.

This is the home theater I built in my previous house a few years back.

Theater Final by cmaterick.

Craig