Collaboration and Enterprise IT
Posted by Rich Crowley in Big Data, IT Architecture, Project Management, strategy
In a keynote speech earlier this year at a Big Data conference in California, Geoffrey Moore talked about how the consumerisation of IT will affect the enterprise. Here are some of his thoughts, and their implications from my perspective.
He suggested that since the 1970s, enterprise IT has focused on building and deploying large systems of record: the admin systems, billing systems, finance and HR systems, with the requisite building blocks like database management systems underneath. This buildout was somewhat repeated, or concentrated, around the Y2K timeframe, and he likened the multi-decade push to the interstate highway buildout many decades before. Once the highway was in place, however, it became simply a means to greater ends.
The consumerisation of IT, which he suggests has been happening since Y2K, is based on three principles: access (google something and get it), broadband, and mobile (cool in the developed world, a necessity in the developing world if you want any mass computing capability). These principles have led to consumerisation and digital engagement on a grand scale. His key conclusion is that organizations that cannot detect, analyse and respond to this increasing level of engagement will suffer, and that without some form of Big Data capability they will not be able to do so. Thus, Big Data is the sensory system of a consumerised, web-enabled world, and enterprise IT's job is now to give organizations this sensory capability above and beyond their systems of record, which are very inward focused.
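To make the detect / analyse / respond loop a bit more concrete, here is a deliberately toy sketch of what a "sensory system" boils down to at its simplest. This is my own illustration, not anything Moore presented; the class name, window size and spike threshold are invented placeholders.

```python
from collections import deque

class EngagementSensor:
    """Toy 'sensory system': watch a stream of engagement counts and
    flag intervals that deviate sharply from the recent baseline."""

    def __init__(self, window=100, threshold=3.0):
        self.window = deque(maxlen=window)  # recent per-interval event counts
        self.threshold = threshold          # multiple of baseline that counts as a spike

    def observe(self, count):
        """Detect: compare the new interval's count to the rolling mean."""
        baseline = sum(self.window) / len(self.window) if self.window else count
        self.window.append(count)
        # Analyse: is this interval a significant deviation from baseline?
        if baseline and count / baseline >= self.threshold:
            return self.respond(count, baseline)
        return None

    def respond(self, count, baseline):
        # Respond: a real system would raise an alert, scale capacity,
        # or feed another downstream service here.
        return f"spike: {count} events vs baseline {baseline:.1f}"

sensor = EngagementSensor(window=5, threshold=3.0)
for c in [10, 12, 11, 9, 10, 40]:
    alert = sensor.observe(c)
    if alert:
        print(alert)  # only the final interval trips the detector
```

The real version of this runs against billions of events, but the shape is the same: a continuous feed in, a model of "normal" in the middle, and an automated response out.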
He went on to suggest that building such a sensory system requires collaboration capabilities to be developed and deployed not only within the enterprise but across enterprises, and these simply don't exist today, or exist in such a weak form that they are prohibitive. Facebook, Twitter, FaceTime, Foursquare: all examples of collaboration capabilities that were developed outside the enterprise and, in large part, don't exist inside it. Some of these capabilities exist to some degree (Yammer is a Twitter alternative), but by and large it has been more difficult to deploy them inside the enterprise than outside of it.
These comments really resonated with me. Having worked many years on countless projects in medium to large companies, one of the obstacles I constantly run into is the difficulty of collaborating with partners external to the organization I'm working with directly, and in some cases even with internal stakeholders.
One example is the lack of a common project artifact repository for planning and execution purposes. All too often, each party has its own MS Project Gantt charts, its own issue and risk lists, its own reporting documents, with lots of overlap. To some degree this is because each party has its own confidential sub-elements, but often it's because there is simply no common technology to leverage, or no organizational willingness to use what is available. Cloud capabilities are chipping away at this: Basecamp from 37signals and Gantter are good examples of project planning tools that can be used for cross-enterprise projects (regardless of whether Agile or waterfall approaches are being followed). The same holds true for design, build and testing systems.
I think some of this lack of collaboration capability is also demographic in nature. The consumerisation movement has been led as much by young folks with smartphones, game consoles and laptop screens with Facebook always on as by anyone else. These are not the same folks who built enterprise systems over the last 30 years. The former are much more comfortable with the notion that there is no privacy, so get over it; the latter feel an imperative to protect information, to hold it close. Both sides have arguments that make sense, but as consumerisation continues as a force, those trying to hold on to information will be bowled over. It is a tug of war between the enterprise's demands for data security, privacy and integrity and the enhanced user experiences we have come to enjoy via consumerised IT, which consumers will continue to demand at increasing levels of sophistication.
So…where does this leave us? It leaves enterprise IT with a new and growing set of challenges that conflict with some of its existing ones. It also leads to a raft of use cases, many of which are not yet conceived, where Big Data strategy will be applied in disruptive ways. Big Data is quite different from BI, though: BI was about educating people with the information contained in data, whereas Big Data operates on a scale so large that educating people won't help; it will be all about machine-to-machine learning. It will mean developing new algorithms that run in real time and feed other machines, other algorithms, and so on. Very interesting stuff indeed.
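That machine-to-machine idea can be sketched as two chained streaming stages, where the output of one algorithm is consumed directly by another with no human in the loop. Again, a toy of my own invention: the stage names, the exponential-average smoothing factor and the throttling rule are all illustrative assumptions, not a real architecture.

```python
def scorer(events):
    """Stage 1: a streaming algorithm that turns raw event values into
    scores, here an exponentially weighted running average."""
    avg = None
    for value in events:
        avg = value if avg is None else 0.2 * value + 0.8 * avg
        yield value, avg

def actuator(scored):
    """Stage 2: a downstream 'machine' that consumes stage 1's output
    and emits decisions, with no person reading a report in between."""
    for value, avg in scored:
        yield "throttle" if value > 2 * avg else "ok"

events = [5, 5, 6, 5, 30, 5]
decisions = list(actuator(scorer(events)))
print(decisions)  # → ['ok', 'ok', 'ok', 'ok', 'throttle', 'ok']
```

Nothing here needs a human to look at the data; the algorithms educate each other, which is the point.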