
A Practical Dialogue Concerning 

Unified Granular Theory

By Roy D. Follendore III

Copyright (c) 1999 RDFollendoreIII

All Rights Reserved

 

Granularity is a simple concept that gets very complex, very quickly. If you were to pick up a book on Object Oriented Analysis you would see a similar level of complexity occurring.  There is a reason for this: content granularity has its roots there.  This is a story, and like all stories it has a beginning, a middle, and an end. I will not attempt to express every issue, but rather the fundamental ideas.  As you will see when you reread this discussion, that is itself a recursive ideal, in that content is being withheld for the sake of clarity.  So rather than responding to all of your questions about the relationship between granularity and technological implementation, I suggest that we start at the beginning. 

Plato invented the idea of Objects thousands of years ago.

Plato essentially said that the Universe is made up of "things" we call "objects" and that objects can contain objects.  In talking about the Universe in this way, we are generalizing it, in exactly the same manner that we generalize the concept of things called content.  Things can be anything.  Content can be anything.  The difference is that if we were to say that we would take a thing called an elephant and put that thing within the eye of a needle, we would know we were talking metaphorically, because the physical universe prevents us from doing that.  The concept of content also has two ways of being considered: there is the statement of fact, and there is the symbolic metaphor.  The physical universe dictates what can and cannot occur as a fact of reality.  Our rationality dictates what can and cannot occur symbolically.  I bring all of this up because if you want to understand the concept of content granularity as I am speaking of it, I need to be certain that you and I are talking about the right things.  There is the thing that enables content granularity, and then there is the content granularity itself.  

The words that we are using fail us when we get into the levels of abstraction that are necessary, and this is the reason why I have been doing my best to define a new language.  The word "granular" is of particular concern because it actually implies something being coarser, or having larger aggregations.  In day to day life, some people use it in terms of making things larger and others use it in terms of making things smaller.  "Fine grain" appears to be a more consistent and more accurate way of talking about the concept of granularity if we are referring to the smaller grain.  Granularity is therefore about the comparison of sizes, and does not refer to a class of sizes.

 

I hope that you understand this because if you don't you will be confused.

 

When we take a symbolic concept and put it within another concept, as we do in writing for instance, some interesting things happen.  They become an aggregation.  The two things become one larger concept than either would be alone.  There are too few letters and too few degrees of freedom for letters to be of much use to us as metaphorical symbols.  We use them to "code" symbols, in the same way that the sounds of Morse code can be used to code letters, and in turn the same way that ASCII standards code letters.  Alphanumeric symbols and characters are two different things.  Going back to what I previously discussed, a symbol is a metaphorical idea and a character is a physical idea. I am sitting here looking at characters that will not be transferred around our planet to you.  Instead, symbols will be sent that, I hope, will be converted back into characters at your end. The letters within this message are symbols that are in and of themselves useless, and for the most part meaningless; however, they are aggregated to make words, which are more useful.  Words stand for things, and because of that they are the first level at which metaphorical ideas can truly be expressed.         
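As a minimal illustrative sketch of that layering, assuming nothing beyond standard Python and a few Morse mappings chosen only for this example, characters can be shown to be codes stacked upon codes, with meaning appearing only at the level of the aggregated word:

```python
# A minimal illustrative sketch: characters are themselves codes, one layer
# of encoding stacked on another, and only their aggregation into a word
# begins to carry metaphorical meaning.

# A few Morse code mappings, just enough for this example word.
MORSE = {"c": "-.-.", "o": "---", "d": "-..", "e": "."}

word = "code"

for ch in word:
    # Each character is already a code: shown here as its ASCII number
    # and, one layer further down, as Morse signal elements.
    print(f"symbol {ch!r} -> ASCII {ord(ch)} -> Morse {MORSE[ch]}")

# The aggregation is where meaning starts: "code" stands for something,
# while "c", "o", "d", and "e" individually do not.
print("aggregated word:", word)
```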

The second thing that must be kept straight is that an idea remains alive in the mind, not within symbols or characters. There are therefore these objects which we manipulate, and there is the process which is ultimately the actual object we are attempting to manipulate.  Here in Northern Virginia, by typing this communication, I am therefore not attempting to manipulate these characters; I am attempting to manipulate your brain. If we lose track of this concept then we lose completion. 

The first person principle under which your mind is operating at the moment you read this involves sensing that which is within your range of senses and connecting what is perceived with the state of what you feel and know at the next moment, so that at a second order you anticipate and have empathy.  When we choose to write with sophistication we do so with the understanding that what we say will be translated through this first order and into this second order of empathy.  It is a transaction that we anticipate as authors but have forgotten we have learned.    

Let me give you a simple symbolic visualization of the technical concept of how granularity with perspective works in physical space.  Consider the idea that a particular character, let us say the letter "p", is a part of the content of the alphabet.  It represents a fine grain, what I have termed a "particle", of the alphabet.  If you were to present this alphabet from one perspective it would look like "abcdefghijklmnopqrstuvwxyz".  Yes, the letter "p" is right there where we assume it would be, between the letters "o" and "q", in accordance with the standards of alphabetic character cardinality.  If you were to turn this alphabetical symbolic character set on its axis you would see "a" or you would see "z".  Therefore "aaaaaaaaaaaaaaaaaaaaaaaaaa" or "zzzzzzzzzzzzzzzzzzzzzzzzzz" are conceptual arrays of alphabets just as "abcdefghijklmnopqrstuvwxyz" is.  Consider that the computer screen you are probably looking at is drawing and redrawing every character millions of times (but the symbol only once), and any character on this screen has some potential of being replaced by any other character if the symbol were to change. 

The point of this exercise is that we could either choose to present the symbol "p" while ignoring every letter to the left or right of our horizontal alphabet, or choose to present the symbol "p" by ignoring everything before it.  In either case, the symbol "p" is presented as the correct character, but when you consider it, the result is not actually the same, because the perspective is altered.  In the first case, all of the possible combinations of character-symbols are available for inspection, and in the second case they are not.  In the second case we do not actually know whether there is a cardinality of symbolic alphabetical order beyond our chosen letter.  Moreover, if we did not inspect the cardinality of order prior to our character, and simply asked for the 16th character because we expect "p" to be there, we would not know whether any cardinality existed at all, apart from "p" itself.  In that case we might assume some higher degree of probability that some degree of cardinal order exists within our alphabet, but we certainly could not know it to be true.      
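The two perspectives can be written out as a minimal sketch, assuming only standard Python; the array, the index, and the printed commentary are illustrative and not part of the original exercise:

```python
# A sketch of the two perspectives on presenting the letter "p".

alphabet = "abcdefghijklmnopqrstuvwxyz"

# Perspective one: the whole horizontal array is open for inspection, and
# "p" is found exactly where cardinal order says it should be.
position = alphabet.index("p")            # 15, i.e. the 16th character
print(alphabet, "-> 'p' confirmed at position", position + 1)

# Perspective two: we simply ask for the 16th character because we expect
# "p" to be there, inspecting nothing before or after it.
blind_pick = alphabet[15]
print("16th character requested blindly:", blind_pick)

# Both perspectives present the same correct character, but only the first
# lets us verify that cardinal order exists beyond the chosen letter; the
# second merely assumes it.
```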

I hope that at this point you understand that this is an original and fundamental concept founded squarely upon the theoretical philosophies of cryptography.  Within the concept of cryptography, every character symbol is potentially interchangeable. There are many possible arrangements of the alphabet. In fact, because context is involved, there are n dimensions and therefore infinite possible arrangements.  As a cryptographer, when I think of an alphanumeric symbol represented as a character I see an infinite array with infinite potentials of existence, not a single character. 
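A small sketch, offered only as an illustration and not as any particular cipher, shows the scale of that interchangeability: even before context adds its dimensions, a single alphabet admits 26! substitution arrangements, any one of which could stand behind the character you see.

```python
import math
import random

# Every character symbol is potentially interchangeable: behind a single
# plaintext letter could stand any one of 26! substitution alphabets,
# even before context adds further dimensions.
alphabet = list("abcdefghijklmnopqrstuvwxyz")
print("possible substitution alphabets:", math.factorial(26))

# One randomly chosen arrangement, used as a simple substitution cipher.
shuffled = alphabet[:]
random.shuffle(shuffled)
table = dict(zip(alphabet, shuffled))

plaintext = "granularity"
ciphertext = "".join(table[c] for c in plaintext)
print(plaintext, "->", ciphertext)
```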

From the philosophy of a Cryptocommunicator I see the same opportunities, but with words, images and image fragments, data, information, knowledge, and ultimately wisdom.  Taken to the ultimate conclusion, in a very real physical sense, all of the possible knowledge of the Universe is therefore locked between a single character and entropy.  Man has chosen a linear horizontal array of characters to convey and present knowledge as our means of communicating because there was no technical means to do otherwise. Because of advances in technology, this situation has changed. I invented Virtual Private Networks by asking the question, "If I were to have invented cryptography, knowing what I know today, what would I have done differently?"  My research over the years has taught me a few things, so now I am asking the question, "If I were to have invented communicated symbology, knowing what I know now, what would I have done differently?"

Managing granularity is ultimately about taking human communication past what exists as characters or words and manipulating ideas, in the form of data, information and knowledge, in context with first person principles rather than being satisfied with second order principles.  It means assuming control of and responsibility for authorship within the total communication process.  It also means taking control of and managing the potential of the unknown, rather than representing the unknown as a single absolute, which it is not.  Isolated absolute truth cannot exist.  If communication exists to present truth, then we must accept the physical fact that individual truth is contextual and is based on the ultimate relative perspective of the human mind and perception.  Human beings are complex and do not share the same mind and perception.  This means that if we want to convey the same truth to one individual person and then accurately convey that truth to another, there is a need to change the context of the content.  We must be able to communicate appropriately, introducing that which is necessary and leaving out that which in other contexts would be unnecessary. The general concept of content granularity implies content control, and that in turn implies the management of what is there and what is not there.  It also implies some means of controlling what is and is not to be presented. 
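As a minimal sketch of that kind of content control, assuming hypothetical fragments and reader contexts invented purely for illustration, the same body of content can be assembled differently for different readers, introducing what is necessary and leaving out the rest:

```python
# Hypothetical content fragments, each tagged with the reader contexts in
# which it is appropriate; nothing here is from a real system.
fragments = [
    {"text": "Granularity compares the sizes of content.",
     "contexts": {"novice", "expert"}},
    {"text": "Fine grain refers to the smaller grain.",
     "contexts": {"novice"}},
    {"text": "Cardinality of order cannot be assumed without inspection.",
     "contexts": {"expert"}},
]

def assemble(reader_context: str) -> str:
    """Introduce what is necessary for this reader and leave out the rest."""
    return " ".join(f["text"] for f in fragments
                    if reader_context in f["contexts"])

print("novice reader:", assemble("novice"))
print("expert reader:", assemble("expert"))
```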

It is one thing to think in terms of stuffing symbols and characters together and another thing to intelligently manage and distribute those constructions.  In the 1980s the concept of AI was thought to be about to revolutionize humanity, and it has, but not in the ways that we anticipated.  HTML and languages like it are similar to the AI languages Prolog or Lisp: to make them useful you need to architect a useful system from them, and then you have to embed that system into a concept that is larger.  It is one thing to use language toys to hand craft useful knowledge for an individual, but another thing to craft tailored, consistent knowledge for a mass of individuals across cultures. 

You may have noticed that not once in this rather lengthy dialogue have I mentioned the word Premonition.  Premonition is important in that, for the first time, it combines cryptography and artificial intelligence at its core as the means by which to achieve the level of control that is ultimately necessary to manage large amounts of granularity and entropy in a rational way. This is the only way known to assure that such complex content granularity is presented appropriately.

  • Cryptography is the only known way to authenticate and secure transactions in the digital world.  It creates and manages n dimensions of what can be inside and outside granularity that would not otherwise exist within the concept of physical granularity.  Furthermore, the act of hiding and revealing data, information and knowledge is just such a transaction within these dimensions. 
  • AI is the only known way to control and make rational sense of very complex, non-cardinal, and inferential arrangements and permutations of transactions using computers. 

The engineering of intimately fusing AI and Cryptography into a single general purpose system turns out to be an orchestral design activity that is not trivial.     
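To make the shape of that fusion concrete in miniature, here is a toy sketch only; it is not Premonition, and the labels, the clearance rule, and the use of the third-party cryptography package are assumptions made purely for illustration. Fragments are sealed under per-label keys, and a simple rule decides which keys a reader receives, so that hiding and revealing become explicit transactions:

```python
# A toy fusion of cryptography and rule-based control: fragments are
# encrypted under per-label keys, and a simple rule decides which keys a
# reader receives, so hiding and revealing become explicit transactions.
# Requires the third-party "cryptography" package (pip install cryptography).

from cryptography.fernet import Fernet

# One key per content label.
keys = {"public": Fernet.generate_key(), "sensitive": Fernet.generate_key()}

fragments = [
    ("public", b"Granularity compares the sizes of content."),
    ("sensitive", b"The cardinality of this collection was never inspected."),
]

# Hiding: every fragment is sealed under its label's key.
ciphertexts = [(label, Fernet(keys[label]).encrypt(text))
               for label, text in fragments]

def keys_for(clearance: str) -> dict:
    """A stand-in for the inferential side: decide which label keys a reader
    may hold. A real system would reason over far richer context than this."""
    allowed = {"public"} if clearance == "low" else {"public", "sensitive"}
    return {label: keys[label] for label in allowed}

def reveal(clearance: str) -> list:
    """Revealing is a transaction: only fragments whose keys were granted
    can be decrypted and presented."""
    granted = keys_for(clearance)
    return [Fernet(granted[label]).decrypt(token).decode()
            for label, token in ciphertexts if label in granted]

print("low clearance sees:", reveal("low"))
print("high clearance sees:", reveal("high"))
```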

Premonition is essentially an efficient and secure, label filtered content control and search process for the communication and creation of knowledge, and it has applications far beyond the concept of granularity.  Since communication is what Computer and Information Science is ultimately about, Premonition is not only important for the Internet but for our relationship with the nature of computers as a whole.  The processes that are inherent within Premonition change the way that computers and the Internet can operate, and that in turn will change what they can do, how we respond to them, and how they respond to us.  

It is my profound hope that this essay has done more to clear up conceptual roadblocks to understanding than to create new ones. I believe that there is much to communicate and to further explore.  Problems are pebbles on bare feet that do not need clear definition unless you must make contact with them.  I feel that one can choose to step on them or step over them along the path.  I have learned that it is far easier to understand why a pebble was there once it is no longer an obstacle.  The way that I choose to work is to redefine what is possible through the discovery of coherent philosophy and to invent the means through coherent engineering.  

 

I thank John Magnik in Australia again for his excellent questions, as they give me the opportunity to reflect on the pebbles that were in my path, and in doing so to make my work better understood.  Roy D. Follendore III

   

 

 


Copyright (c) 2001-2007 RDFollendoreIII All Rights Reserved