My Talk at the RAEng Fellows Day at Newcastle

I was invited to give a talk on my research at the Royal Academy of Engineering Fellows Day event, held in Newcastle on 28 January 2019.

The title of the talk was “Asynchronous Design Research or Building Little Clockless Universes”.

The PDF of the slides of my talk is here: http://async.org.uk/presentations/AlexYakovlev-Research-RAEngEvent-280119.pdf

I only had 15 minutes given to me, which is not a lot to cover 40 years of research life. So, at some point in preparing for this talk, I decided that I would try to explain what research in microelectronic systems design is about, and in particular how my research in asynchronous design helps it.

Basically, I tried to emphasize the role of ‘time control’ in designing ‘little universes’, where the time span covered by our knowledge of what is going on in those systems, and why, stretches from a few picoseconds (a transistor switching event) to hours if not days (application lifetimes). So we cover a span of around 10^17 to 10^18 elementary events. How does that compare to the life of the Universe, which is “only” around 1.4×10^10 years, i.e. roughly 4×10^17 seconds? Are we as powerful as gods in creating our ‘little universes’?
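
To make this arithmetic concrete, here is a little back-of-the-envelope check (a purely illustrative Python sketch using the rough figures above; the picosecond, one-day and Universe-age numbers are approximate):

```python
import math

# Rough time scales from the argument above
transistor_switching = 1e-12        # one transistor switching event, ~1 ps
application_lifetime = 24 * 3600.0  # an application running for a day, in seconds

# How many switching-scale time steps fit into an application lifetime?
span = application_lifetime / transistor_switching
print(f"time span: ~10^{math.log10(span):.0f}")                  # ~10^17

# For comparison: the age of the Universe (~1.4e10 years) in seconds
universe_age_s = 1.4e10 * 365.25 * 24 * 3600
print(f"Universe age: ~10^{math.log10(universe_age_s):.0f} s")   # ~10^18 s
```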

So, in my research I want to better control TIME at the smallest possible scale, and, surprisingly, I do it by going CLOCK-LESS! Clocking creates an illusory notion of determinacy in tracking events and their causal relationships. Actually, it obscures such information. Instead, by doing your circuit design in a disciplined way, such as speed-independent circuit design, you can control the timing of events down to the finest level of granularity. In my research I achieved that level of granularity for TIME. It took me some 40 years!
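
For readers who have not met speed-independent circuits, a purely illustrative sketch of the classic primitive, the Muller C-element, may help: its output changes only when both inputs agree, so every output event is caused by a specific pair of input events rather than by a clock edge. (The code below is my own toy model, not a design artefact.)

```python
# A behavioural model of a Muller C-element, the classic speed-independent
# primitive: the output rises only after BOTH inputs have risen, and falls
# only after BOTH inputs have fallen; otherwise it holds its previous value.
def c_element(a: int, b: int, prev_out: int) -> int:
    if a == b:
        return a          # both inputs agree: output follows them
    return prev_out       # inputs disagree: output holds (no clock needed)

# The output event c+ is caused by the later of a+ and b+, whatever the
# actual gate and wire delays are; that is the sense in which timing is
# controlled by causality rather than by a clock.
out = 0
for a, b in [(1, 0), (1, 1), (0, 1), (0, 0)]:
    out = c_element(a, b, out)
    print(a, b, "->", out)    # prints 0, 1, 1, 0
```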

But furthermore, more recently, say in the last 10 years, I have managed to learn pretty well how to manage power and energy down to that smallest possible level too, and actually to make sure that energy consumption is known to the level of events controlled in a causal way. Energy/power-modulated computing, and its particular form of power-proportional computing, is the way to do that. We can really keep track of where energy goes down to the level of a few femtojoules. Indeed, if the parasitic capacitance of an inverter output in a modern CMOS technology is around 10 fF and we switch it at Vdd = 1 V, we are talking about a minimum energy quantum of CV^2 = 10 fJ = 10^-14 J per charging/discharging cycle (0-1-0 in terms of logic levels). Mobile phones run applications that can consume energy at the level of 10^4 J. Again, as with time, we seem to be pretty well informed about what is going on in terms of energy, covering some 10^18 energy quanta! Probably, I will just need another 5 or so years to conquer determinacy in energy and power terms; our work on Real-Power Computing is in this direction.
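
The energy arithmetic can be checked in the same back-of-the-envelope way (an illustrative Python sketch using the 10 fF, 1 V and 10^4 J figures quoted above):

```python
import math

C = 10e-15    # parasitic capacitance of an inverter output, ~10 fF
Vdd = 1.0     # supply voltage, 1 V

# Energy per full 0-1-0 charging/discharging cycle of that node
E_switch = C * Vdd ** 2
print(f"energy per switching cycle: {E_switch:.0e} J")    # 1e-14 J, i.e. 10 fJ

# Energy budget of a mobile-phone application run, as quoted above
E_app = 1e4   # ~10^4 J

# Span between the smallest tracked energy quantum and the application budget
print(f"energy span: ~10^{math.log10(E_app / E_switch):.0f}")   # ~10^18
```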

Now, what’s next, you might ask? What other granularification, distribution and decentralization can we conquer in terms of building little universes!? The immediate guess that comes to my mind is the distribution (in the time and energy directions) of functionality, and, to be more precise, of intelligence. Can we create granules of intelligence at the smallest possible scale, and cover the same orders of magnitude? It is a hard task. Certainly, for CMOS technology it would be really difficult to imagine that we can make something like a small collection of transistors dynamically learn and optimize its functionality. But there are ways of getting pretty close to that. One of them seems to be the direction of learning automata. Read about Tsetlin automata, for example (https://en.wikipedia.org/wiki/Tsetlin_machine), in the recent work of Ole-Christoffer Granmo.
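
To give a flavour of how small such a granule of intelligence can be, here is a purely illustrative sketch of a classic two-action Tsetlin (learning) automaton, the building block behind the Tsetlin machine work mentioned above; its entire memory is one small counter. The state depth and the toy environment below are my own arbitrary choices.

```python
import random

class TsetlinAutomaton:
    """A two-action Tsetlin automaton with 2*n states: a finite-state
    learning element whose whole memory is one small counter, which is
    why it is attractive as a tiny hardware 'granule of intelligence'."""

    def __init__(self, n: int = 6):
        self.n = n
        self.state = random.choice([n, n + 1])   # start next to the boundary

    def action(self) -> int:
        # States 1..n choose action 0; states n+1..2n choose action 1.
        return 0 if self.state <= self.n else 1

    def reward(self) -> None:
        # Move deeper into the current action's half (more confidence).
        self.state += -1 if self.action() == 0 else 1
        self.state = min(max(self.state, 1), 2 * self.n)

    def penalize(self) -> None:
        # Move towards the boundary; crossing it switches the action.
        self.state += 1 if self.action() == 0 else -1

# Toy environment: action 1 is rewarded 80% of the time, action 0 only 20%.
ta = TsetlinAutomaton()
for _ in range(1000):
    chosen = ta.action()
    if random.random() < (0.8 if chosen == 1 else 0.2):
        ta.reward()
    else:
        ta.penalize()

print("learned action:", ta.action())   # converges to action 1 almost always
```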

On the Role of Mathematics for humanity in building physical reality

Mathematics is a (or, probably, the only!) language that enables ideas about physics to be communicated between people across different generations and across different cultures.

Inevitably, it “suffers” from approximation and abstraction compared to physical reality, a bit like the way an impressionist painting reflects the real scene.

The question is what and how much is sacrificed here.

One test of whether the sacrifice is acceptable or not is whether people, while using mathematics, can build physical objects such as airplanes, cars, bridges, radios, computers, etc. If they can, and at a reasonable cost, then the language is adequate to the purpose.

For example, it seems that the mathematical language of Heaviside’s operational calculus is sufficient for the purpose of designing and analysing electrical circuits of good quality in an acceptable time.
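
As a one-line illustration of that language (a standard textbook example, not tied to any particular design): Heaviside replaces the time derivative by an operator p, turning a differential equation into algebra.

```latex
% A first-order RC circuit driven by a voltage u(t), with output v(t):
\[
  RC\,\frac{dv}{dt} + v = u .
\]
% Replacing d/dt by Heaviside's operator p turns the ODE into algebra:
\[
  (RCp + 1)\,v = u
  \quad\Longrightarrow\quad
  v = \frac{1}{1 + RCp}\,u .
\]
```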

Another example: the language of Boolean algebra is sufficient for designing logic circuits if we clock them safely so that they don’t produce any hazards. If, however, we don’t clock them safely, we need other ways to describe causal relationships between events, such as Signal Transition Graphs.
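
To give a feel for what such a description looks like, here is a deliberately simplified sketch of my own (not a real STG tool format): the four-phase request/acknowledge handshake written as a tiny Petri-net-like graph whose transitions are signal edges and whose arcs are causal dependencies.

```python
# A minimal Signal-Transition-Graph-style model of a four-phase handshake:
# transitions are signal edges (req+, ack+, req-, ack-), arcs are causal
# dependencies, and the marking records which causes have already happened.
causes = {            # transition -> the transitions that must precede it
    "req+": ["ack-"],
    "ack+": ["req+"],
    "req-": ["ack+"],
    "ack-": ["req-"],
}
marking = {"ack-"}    # initially ack- is taken to have "just happened"

def enabled(t):
    return all(c in marking for c in causes[t])

# Fire the cycle of events dictated purely by causality (no clock involved)
for _ in range(8):
    t = next(t for t in causes if enabled(t))
    print("fire", t)                       # req+, ack+, req-, ack-, req+, ...
    marking.difference_update(causes[t])   # consume the causes
    marking.add(t)                         # record the new event
```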

My PhD Thesis (1982) – scanned copy in pdf

I have finally managed to scan my PhD thesis “Design and Implementation of Asynchronous Communication Protocols in Systems Interfaces”, written in Russian (“Проектирование и реализация протоколов асинхронного обмена информацией в межмодульном интерфейсе”).

The thesis is spread across several files (255 pages in total):

Title, Contents and Introduction:

Chapter 1 (General characterization of the methods of formal synthesis and analysis of communication protocols): 

Chapter 2 (Formalization of the behaviour of interacting objects and communication protocols):

Chapter 3 (Interpretation of asynchronous processes and use of interpreted models for the description and analysis of protocols):

Chapter 4 (Organization of aperiodic interface of intermodular communication):

Conclusion and References:

Appendices (1-5):

(1) Example of context procedure

(2) Example of controlled protocol

(3) Application of Petri nets to specification of asynchronous discrete structures

(4) Information transfer on three-state lines

(5) Analysis and implementation of the TRIMOSBUS interface

Exploitation confirmation letter from Ufa plant

Bridging Async and Analog at ASYNC 2018 and FAC 2018 in Vienna

I attended ASYNC 2018 and FAC 2018 in Vienna in May. It was the first time these two events were co-located back to back, with FAC (Frontiers of Analog CAD) following ASYNC.

See http://www.async2018.wien/

I gave an invited ‘bridging’ keynote “Async-Analog: Happy Cross-talking?”.

Here are the slides in pdf:

https://www.staff.ncl.ac.uk/alex.yakovlev/home.formal/talks/ASYNC18-FAC18-keynote-AY-last.pdf

Energy-vector, momentum, causality, Energy-scalar …

Some more interesting discussions with Ed Dellian have resulted in this ‘summary’, made in the context of my current level of understanding of the Catt theory of electromagnetism:

  1. Energy current (E-vector) causes momentum p.
  2. Causality is established via the proportionality coefficient c (the speed of the energy current).
  3. Momentum p is what mediates between the E-vector and changes in the matter.
  4. Momentum p is preserved as the energy current hits the matter.
  5. Momentum in the matter presents another form of energy (E-scalar).
  6. The E-scalar characterises the elements of the matter as they move with a (material) velocity.
  7. As elements of the matter move, they cause changes in the energy current (E-vector), and this forms a fundamental feedback mechanism (which is recursive/fractal …).

Telling this in terms of EM theory and electricity:

  • E-vector (Poynting vector aka Heaviside signal) causes E-scalar (electric current in the matter).
  • This causality between E-vector and E-scalar is mediated by momentum p causing the motion of charges.
  • The motion of charges with material velocity causes changes in the E-vector, i.e. the feedback effect mentioned above (e.g. self-induction); the standard notation behind these quantities is sketched below.
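
For reference, and purely as standard textbook electromagnetism (not a statement of Catt’s or Dellian’s own formulations), the quantities behind these bullets are usually written as follows.

```latex
% Standard relations behind "energy current" and field momentum:
\[
  \mathbf{S} = \mathbf{E} \times \mathbf{H}
  \qquad \text{(Poynting vector: energy flux density)},
\]
\[
  \mathbf{g} = \frac{\mathbf{S}}{c^{2}}
  \qquad \text{(electromagnetic momentum density)},
\]
% and for a plane wave the energy density u and momentum density g satisfy
\[
  \frac{u}{\lVert \mathbf{g} \rVert} = c ,
\]
% i.e. exactly the kind of geometric proportionality discussed here.
```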

I’d be most grateful if someone refutes these items and bullets.

I also recommend reading my blog post (from 2014) on discretisation:

On Quantisation and Discretisation of Electromagnetic Effects in Nature

Real Nature’s proportionality is geometric: Newton’s causality

I recently enjoyed e-mail exchanges with Ed Dellian.

Ed is one of the very few modern philosophers and science historians who have read Newton’s Principia in the original (and produced his own translation of the Principia into German, published in 1988).

Ed’s position is that the real physical (Nature’s) laws reflect cause and effect in the form of geometric proportionality. The most fundamental of these is E/p = c, where E is energy, p is momentum and c is velocity, a proportionality coefficient, i.e. a constant associated with space over time. This view is in line with the Poynting-vector understanding of electromagnetism, also accepted by Heaviside in his notion of ‘energy current’. It is even the basis of Einstein’s E/(mc) = c, i.e. E = mc^2.
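
In symbols, and only as my own paraphrase of this position:

```latex
% Geometric (ratio-to-a-constant) proportionality of energy and momentum:
\[
  \frac{E}{p} = c
  \qquad\Longleftrightarrow\qquad
  E = c\,p ,
\]
% and, with p = mc, this becomes the relation referred to above:
\[
  \frac{E}{mc} = c
  \qquad\Longleftrightarrow\qquad
  E = mc^{2} .
\]
```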

The diversion from geometric proportionality towards arithmetic proportionality was due to Leibniz and his principle of “causa aequat effectum“. According to Ed (I am quoting him here): “it is a principle that has nothing to do with reality, since it implies “instantanity” of interaction, that is, interaction independently of “real space” and “real time”, conflicting with the age-old natural experience expressed by Galileo that “nothing happens but in space and time” “. It is therefore important to see how Maxwellian electromagnetism is treated by scholars. For example, Faraday’s law states an equivalence of the EMF and the rate of change of magnetic flux; it is not a geometric proportion, hence it is not causal!
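
To make the Faraday example concrete, in its standard textbook form:

```latex
% Faraday's law of induction in its usual form:
\[
  \mathcal{E} = -\,\frac{d\Phi_B}{dt} ,
\]
% an equality between the EMF and a rate of change of flux: an "arithmetic"
% equivalence in the above terms, not a geometric proportion like E/p = c.
```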

My view, which is based on my experience with electronic circuits and my understanding of the causality between energy and information transfer (state changes), where energy is the cause and information transfer is the effect, is in agreement with geometric proportionality. Energy causes state transitions in space-time. This is what I call energy-modulated computing. It is challenging to refine this proportionality in every real problem case!

If you want to know more about Ed Dellian’s views, I recommend visiting his site http://www.neutonus-reformatus.de which contains several interesting papers.