Prasad Rampalli
9/5/2016
The notion of simplifying the IT environment to optimize TCO
and agility is not new. For many of us involved in designing or running IT,
complexity has come in many avatars over the past three decades.
My first experience with this was from 1987 through the early 90's, while
deploying Manufacturing Execution Systems (MES) in wafer fabrication facilities
at a leading semiconductor company.
Our big challenge was dealing with the complexity of configuring the
manufacturing process and ensuring timely change control over it. What made
this daunting was the combinatorial effect of a given process with 10+ routes
x 100+ process steps per route x 100+ pieces of equipment x 1000's of
statistical process control and engineering parameters x 100,000's of lots...
I am sure you get the picture.
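To put a rough number on that, here is a back-of-the-envelope sketch that simply multiplies the lower bounds quoted above. Treating the dimensions as independent is my own simplifying assumption (no single lot touches all of them), but it conveys the order of magnitude the analysts had to keep consistent:

```python
# Back-of-the-envelope scale of the configuration space, using the
# lower bounds quoted in the text as rough, illustrative numbers.
routes = 10            # routes per process
steps_per_route = 100  # process steps per route
equipment = 100        # pieces of equipment
parameters = 1_000     # SPC and engineering parameters
lots = 100_000         # lots flowing through the fab over time

# Assumption: dimensions treated as independent for illustration only.
combinations = routes * steps_per_route * equipment * parameters * lots
print(f"~{combinations:.1e} potential configuration points")  # ~1.0e+13
```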
We had an army of shop floor analysts dedicated to keeping up with these
changes and ensuring the overall system reflected what was really going on
on the shop floor at all times. Even so, we found ourselves slow (and costly)
to respond to needed changes, and as a result the quality of the data coming
out of the shop floor system was never 100%.
To solve this problem we felt we needed to eliminate the shop floor systems
analyst as the "middle-person" and have the process engineers or planners in
the fab implement these changes directly into the system. This was not an easy
task: making these changes through the existing UI and management tooling
required a skilled analyst who understood the intricacies of the shop floor
modules and their data relationships, and who could validate every change for
downstream effects on logic and accuracy.
After some brainstorming with my team and our peer industry network
(AMD, Harris, TI, National...), we felt the ideal solution was to create a
standard "declarative format" in a business-friendly language to front-end all
of our shop floor configuration. We called it the "rules- or spec-driven
enterprise". The idea was to make the fab process spec the "master" and embed
an active declarative language directly into the online spec. Once the process
changes in the online spec were approved by the engineering or manufacturing
lead in the fab, we would "push button" the changes into the real system. In
essence, we would "mask" the inherent complexity of the system from the end
user (the process engineer or planner, in this case) by codifying the
shop-floor rules as an integral part of the process spec.
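A rough sketch of what we had in mind, expressed in modern terms. Every name and field here is a hypothetical illustration (the actual spec language and shop-floor modules were proprietary); the point is that the spec is the declarative master, and validation plus deployment sit behind the "push button":

```python
# Hypothetical, modern-day sketch of a spec-driven shop floor change.
# All structures and names are illustrative, not the actual 1990s system.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class ProcessStep:
    name: str
    equipment: str
    spc_limits: dict[str, tuple[float, float]] = field(default_factory=dict)

@dataclass
class ProcessSpec:
    """The online process spec acts as the single 'master' document."""
    process: str
    revision: str
    approved_by: str | None = None
    steps: list[ProcessStep] = field(default_factory=list)

def validate(spec: ProcessSpec) -> list[str]:
    """Check the declarative spec for downstream inconsistencies
    before it is allowed to touch the real shop-floor system."""
    errors = []
    for step in spec.steps:
        for param, (lo, hi) in step.spc_limits.items():
            if lo >= hi:
                errors.append(f"{step.name}/{param}: lower limit >= upper limit")
    if spec.approved_by is None:
        errors.append("spec not yet approved by engineering/manufacturing lead")
    return errors

def push_button_deploy(spec: ProcessSpec) -> None:
    """'Push button' the approved spec into the shop-floor system."""
    problems = validate(spec)
    if problems:
        raise ValueError("change rejected: " + "; ".join(problems))
    print(f"Deploying {spec.process} rev {spec.revision} to the shop floor...")
```

The engineer edits and approves the spec; the validation and deployment mechanics stay hidden behind the "push button".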
Think of it as the “dev-ops” for running the shop floor back
in the 1990’s!
Did it work? Not really. We didn't have the technology maturity in our
modeling tools and deployment architecture to make this a reality (not to
mention the organizational and business transformation issues, which are a
whole different topic). The notion of a UML (unified modeling language)
addressing the interoperability challenges, along with standard semantics for
the shop floor in semiconductor manufacturing, just had too many hurdles to
overcome in 1990.
Since then, I have seen this need to solve complexity come up many times, at
different levels of the solution stack, over my 30+ years in the industry.
Here is a smattering (not meant as an all-inclusive list):
- The roll-out of packaged ERP systems in the late 90's made enterprise integration a key architectural focus. EAI (enterprise application integration) technologies with message-bus and hub-and-spoke architectures became the panacea for solving our "spaghetti" point-to-point application complexity.
- We invested millions of dollars in corralling data management and establishing one version of the "truth" for master data with enterprise data management tools, ETLs and scalable (albeit proprietary) data warehouses.
- Virtualization led to VM sprawl, and the need for operations automation became paramount. "Infrastructure as Code" was born, with SOI (service-oriented infrastructure) and IDEs (integrated development environments) to address IT agility.
As a glass-half-full view, I think we made progress in each phase.
Fast forward to the cloud era: complexity continues to be the #1 challenge
(besides security, of course!) in the transition currently underway. In the
90's we were dealing with the spaghetti "mess" inside the four walls of the
enterprise; now the same problem has oozed into hybrid or cross-cloud
deployment architectures, since it is a given that a Fortune 1500 IT shop will
typically run multiple public clouds alongside its own private cloud in the
coming decade (a key VMW 2016 theme).
The good news (unlike circa 1990): the entire infrastructure, OS and
application stack can now truly be represented in software, thanks to
virtualization of compute, storage and now the network, plus containerization.
The ability to create a "model-driven cross-cloud architecture" and implement
it as a real-time DevOps model in rich, user-friendly declarative formats is
sorely needed to accelerate this transition. Many players are jumping into
this space, each from their position of strength, and it will be interesting
to see the emerging winners.
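To make "model-driven cross-cloud architecture" a little more concrete, here is a hypothetical, vendor-neutral sketch of what such a declarative model might look like. The structure, field names and the two-cloud layout are my illustrative assumptions, not any particular vendor's format:

```python
# Hypothetical, vendor-neutral sketch of a declarative cross-cloud model.
# The structure and names are illustrative assumptions only.
desired_state = {
    "application": "order-service",
    "placements": [
        {"cloud": "private-vsphere", "region": "on-prem-dc1", "replicas": 4},
        {"cloud": "public-cloud-a",  "region": "us-west",     "replicas": 2},
    ],
    "network": {"policy": "deny-all-except", "allow": ["payments", "inventory"]},
    "compliance": {"data_residency": "US", "encryption_at_rest": True},
}

def reconcile(desired: dict, observed: dict) -> list[str]:
    """Compare the declared model with what each cloud actually reports
    and return the change actions a DevOps pipeline would apply."""
    actions = []
    for placement in desired["placements"]:
        key = (placement["cloud"], placement["region"])
        running = observed.get(key, 0)
        if running != placement["replicas"]:
            actions.append(f"scale {key} from {running} to {placement['replicas']}")
    return actions

print(reconcile(desired_state, {("private-vsphere", "on-prem-dc1"): 4}))
# -> ["scale ('public-cloud-a', 'us-west') from 0 to 2"]
```

The model declares where the application should run across private and public clouds; a reconciliation loop turns the gap between declared and observed state into the changes to apply, masking the per-cloud complexity from the end user, much like the process spec masked the shop floor.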
More on this in my next blog.
Prasad