Orders, churn, customer-base figures, production costs, margins: almost everything in a company depends on numbers, and those numbers are increasingly blurred by the sheer volume they have reached.
Properly interpreted, analyzed and studied, numbers guide us, through planning, toward the objectives that companies define with their teams to achieve shared results and to safeguard the well-being of the entire workforce.
Numbers create value in society and in customer relationships, revealing the soul and passion that lie within each project.
For now, each "15 minutes of internal systems alignment" represents the ideal slot to think about how to extract something more from the numbers, especially for those who, like me, prefer to work in the field.
But how can we reconcile the indispensable face-to-face meeting, the very essence of the trust relationship between customers and suppliers and a prime source of ideas and projects, with exploiting the full potential of artificial intelligence?
And again, how do we reconcile it with the new normal of agile working that began with digitalization, permeated by the "dematerialization" of #PhygitalWorkingFromEverywhere made possible by remote connection?
In the era of Big Data, new theories and technologies, accompanied by very powerful tools such as Machine Learning and Artificial Intelligence, offer an opportunity to increase the potential of those numbers. The numbers are scrutinized so intently at year-end to draw up sustainable budgets, yet they have even more to say when viewed from perspectives beyond human processing capacity, perspectives that aim to anticipate "how it will be" rather than analyze "how it was".
Over the last five years it has been possible to delve into and study these new technologies with passion, even without a strictly technical IT background. They can give us a great deal, but they need to be supported and applied correctly to a well-defined set of source data and reliable sources.
The sheer size of the data now available should make us reflect: the "zillions" of bytes and sources in any company now escape human analysis, given their number and the time available to study them. Consider also that roughly 90% of all the data in the world has been generated in the last two years, and you grasp the scope of, and need for, the support that computer science, soon quantum, is about to provide.
It is therefore already possible today, with limited technical knowledge but a solid background in one's own specialization (analysis, strategy, sales, management control, purchasing, administration, and so on), to create dashboards and, even better, tools and views that generate reports, analyses, comparisons and simulations, overcoming the deep-rooted belief that expensive software is needed to manage the various work tasks.
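As a minimal sketch of this idea, a few lines of plain Python (standard library only, no specialized software) are enough to turn a flat list of order records into a small margin report per region. The field names and figures here are invented for illustration:

```python
from collections import defaultdict

# Hypothetical order records, as they might come from a company export
orders = [
    {"region": "North", "revenue": 1200.0, "cost": 800.0},
    {"region": "North", "revenue": 600.0,  "cost": 450.0},
    {"region": "South", "revenue": 900.0,  "cost": 700.0},
]

def margin_report(rows):
    """Aggregate revenue, cost and margin percentage per region."""
    totals = defaultdict(lambda: {"revenue": 0.0, "cost": 0.0})
    for row in rows:
        totals[row["region"]]["revenue"] += row["revenue"]
        totals[row["region"]]["cost"] += row["cost"]
    return {
        region: {
            "revenue": t["revenue"],
            "cost": t["cost"],
            "margin_pct": round(100 * (t["revenue"] - t["cost"]) / t["revenue"], 1),
        }
        for region, t in totals.items()
    }

for region, figures in margin_report(orders).items():
    print(region, figures)
```

Nothing more than a dictionary and a loop: exactly the kind of view that, multiplied across sources, becomes a dashboard.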
It is possible to have an extensive, overall view of all the sources available in the company, one that is both orderly and proactive.
Having come this far, let me immediately dispel an illusion: simplicity, speed, low costs?! It is certainly not that simple. On the contrary, success with artificial intelligence, just as in any work, is achieved by paying attention to the rationalization and normalization of the data sources that feed it. As Steve Jobs put it: "Simple can be harder than complex."
It is therefore clear that the preparatory and development work for these new technologies carries significant costs, economic and, even more, in human ingenuity, to surface, streamline and rationalize the sources feeding the multitude of specialized software we want to tap into.
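To make the "normalization of feeding sources" concrete, here is a small hypothetical sketch: two company systems export the same customers with inconsistent name casing and date formats, and a few lines of standard-library Python bring the records to one canonical form and drop the duplicates that emerge. The records and formats are invented for illustration:

```python
import datetime

# Hypothetical raw records from two different company systems:
# inconsistent name casing, date formats and hidden duplicates.
raw_records = [
    {"customer": "ACME Spa", "date": "2023-01-15"},
    {"customer": "acme spa", "date": "15/01/2023"},  # same record, other system
    {"customer": "Beta Srl", "date": "2023-02-03"},
]

def normalize(record):
    """Bring one record to a single canonical form."""
    name = " ".join(record["customer"].split()).title()
    raw_date = record["date"]
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):  # formats seen in the sources
        try:
            date = datetime.datetime.strptime(raw_date, fmt).date()
            break
        except ValueError:
            continue
    else:
        raise ValueError(f"unrecognized date format: {raw_date}")
    return {"customer": name, "date": date.isoformat()}

def deduplicate(records):
    """Normalize all records, then drop the ones that become identical."""
    seen, clean = set(), []
    for rec in map(normalize, records):
        key = (rec["customer"], rec["date"])
        if key not in seen:
            seen.add(key)
            clean.append(rec)
    return clean

clean = deduplicate(raw_records)
```

Trivial on three records; the cost, economic and human, lies in doing this reliably across "zillions" of bytes and dozens of systems.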
The first very topical example of a booster comes from the advent of generative Artificial Intelligence (ChatGPT and the like), first feared and now regulated: a new era that further accelerates the development and evolution processes underway and opens up scenarios that were unthinkable until a few months ago.
Generative? Open, "no code": just Ask and Generate. Indeed, as noted above, we are entering the era of Proactive and Prescriptive Intelligence at our fingertips!
The challenge to be taken up, with the first results already in sight, goes precisely in this direction: it exceeds even the rosiest expectations exponentially and lets you set strategies, carry out analyses and work together with colleagues, partners and customers, from anywhere, on any device.
It makes entire sets of information available in real time, generating new models and new work tools with instant processing speed and high-quality results. Above all, by combining the data with the creativity of the people and professionals it interacts with, it exponentially increases the potential in place thanks to the human factor, a truly indispensable enhancer.
It is no longer a question of exploiting our intellectual creativity alone, which should never be lacking, but of combining it with powerful tools, strengthening the validity of the numbers processed every day in companies, numbers that are the key support for decisions.
The natural outcome of these factors leads to two relevant results:
- Quantitative result: processing speed over a huge amount of constantly updated information
- Qualitative result: interpretation and orderly presentation of a greater number of information views, which generative AI makes readable thanks to self-learning; this gives us results suggested directly by the AI, drawn from intersections the human mind could never match in quantity or speed
Points 1 + 2 = SIMPLICITY (Quality + Quantity) → ∞
These tools have the effect, and the objective, of simplifying professional and working life by making decisions accessible and proposing solutions (they open up scenarios with or without customer collaboration) which only a few years ago would have required teamwork and months of intense development effort. Not only that: they free up time to dedicate to work and, why not, to private life!
Right! I almost forgot… is the secret to a successful output still entirely unresolved?
Any Data Scientist will answer with the wisdom of experience: "90% of success is already inside any company and lies in the quality of the feeding sources." Those sources, however, being a "zillion" bytes, are not always easy to find and rarely straightforward to integrate, given their complexity and fragmentation across systems.
Last point, but first in relevance to the sources: it is true that artificial intelligence processes Big Data, but to what ethical standard does it return its output?
The ethical implications of using AI represent the great challenge in the field, but above all the greatest opportunity for evolution in work.
This topic would deserve numerous in-depth articles of its own, and it would be naive to think of exhausting it in a few words. In the meantime: all analyses are carried out in full compliance with current privacy regulations and company policies, and often contribute to updating those policies, given the speed at which the matter evolves.
In short, to put it as a movie disclaimer: no one's data was mistreated during the ongoing production of this tool!