I looked at these older GUIs & IDEs, and they no longer feel like expert systems. They only feel good for games, and peer pressure from masses of programmers pushes us heavily back into that crib (starting school all over). If I do go back, succession turns into redundant replication rather than forward progress. The command line and shell interface always allow forward thinking, even if it is simply to document progress before system failure.
When all else fails, we lay out the graph paper and chart computations. That is quite an ambiguous array of information, but it makes sense when we project three-dimensional constructs and plot them into the square cells of two-dimensional aligned media. You learned compression when you counted one row of filled cells, like bits, and summed them into a numerical column. You learned decompression when you read the numerical column and plotted the bits back into each row. You notice the spatial relationship of one column versus all those filled rows. That is like banking, and it resolves the many errors of common sense down to missing bits.
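As a rough sketch of that exercise (the class name, method names, and eight-cell row width here are my own assumptions, not any standard notation), one row of filled cells can be summed into a single number and then plotted back out again (in C#):

using System;

static class GraphPaper
{
    // Compression: count one row of filled cells, like bits, and sum them into one number.
    static int SumRow(bool[] filledCells)
    {
        int value = 0;
        for (int i = 0; i < filledCells.Length; i++)
            if (filledCells[i]) value += 1 << i;   // each filled cell adds its place value
        return value;
    }

    // Decompression: read the number and plot a bit back into each cell of the row.
    static bool[] PlotRow(int value, int width)
    {
        bool[] row = new bool[width];
        for (int i = 0; i < width; i++)
            row[i] = ((value >> i) & 1) == 1;
        return row;
    }

    static void Main()
    {
        bool[] row = { true, false, true, true, false, false, false, false };
        int column = SumRow(row);              // 13 goes into the numerical column
        bool[] restored = PlotRow(column, 8);  // reading 13 fills the same cells again
        Console.WriteLine(column);
        Console.WriteLine(string.Join(",", restored));
    }
}

The single number in the column carries the whole row, so a missing bit on either side shows up plainly as a different number.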
Also note that managers argued over plotted graphs against demands for exact functionality, despite the fact that the rules of graph paper, even simulated, are quite ambiguous beyond its media. Let's imagine there are already many printed books that further resolved all those arguments, so there is the redundancy factor. That would be another succession if everybody knew the indexes to every factor, function, or book. Everybody has their own interests, however. With language, we can spell out numeric interest (in C#):
const Decimal zero = Decimal.Zero;
const Decimal one = Decimal.One;
const Decimal two = one + one;
const Decimal five = two + two + one;
const Decimal ten = five + five;
const Decimal hundred = ten * ten;
const Decimal hundredths = one / hundred;
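For example (my own continuation, assuming it sits in the same scope as the constants above), those spelled-out numbers are enough to state a rate of interest and apply it:

const Decimal rate = five * hundredths;      // 0.05
const Decimal principal = ten * hundred;     // 1000
const Decimal interest = principal * rate;   // 50.00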