Welcome to the Norwich Engineering Society, an active forum dedicated to fostering the exchange of ideas and experiences among all those passionate about engineering — past, present, and future.
For over a century, we have focused on the people behind groundbreaking innovations and their visionary concepts. Our mission is to enlighten, support, and develop our members in Norfolk and beyond. We achieve this through unique engagements, dynamic events, informative seminars, and insightful publications.
By championing the diverse disciplines within engineering, we ensure its continued evolution and relevance in an ever-changing world.
In his introduction, John Pickering, who runs Metron, a local electronics consultancy based at Reepham, stressed the importance of measurement in any successful engineering project and why it is vital for successful engineers to understand all the ramifications of the measurement process.
In the first part of his talk, John took us through the early history of how measurements came to be. An important part was brought about through the process of trade and the need to minimise the apparent natural tendency of some humans to cheat. Another driver was safety: ensuring any construction is fit for purpose. Curiosity was also significant. To understand what was going on around them, humans had to make accurate, precise and correlated observations that could be communicated to others. This led to the need for agreed standards and the tools to maintain them.
John pointed out that all measurements are made relative to standards, via instruments calibrated against those standards. He also made the very important observation that a single measurement is unique, and that it is vital, wherever possible, to make several measurements of nominally the same quantity and to infer its actual value and precision by statistical averaging.
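As a hedged illustration of that point (a minimal sketch, not anything shown in the talk), averaging repeated readings and quoting a standard error might look like this in Python; the readings themselves are invented:

```python
import statistics

# Hypothetical repeated readings of nominally the same quantity (volts).
readings = [5.012, 5.008, 5.011, 5.009, 5.013, 5.010]

mean = statistics.mean(readings)    # best estimate of the actual value
stdev = statistics.stdev(readings)  # scatter of the individual readings
sem = stdev / len(readings) ** 0.5  # standard error of the mean

print(f"estimate = {mean:.4f} V +/- {sem:.4f} V from n = {len(readings)} readings")
```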
John then spent a little time ensuring that the Members had a good understanding of the implications of the terms accuracy and precision. He hinted that in his working life he had come across too many examples of engineers who had made very precise measurements of inaccurate values because of some unidentified systematic shift in the underlying value, sometimes referred to as a systematic error.
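The distinction is easy to demonstrate with a small simulation (again an invented sketch, not John's example): readings can cluster tightly, and so be precise, while an uncalibrated offset makes every one of them inaccurate.

```python
import random

random.seed(1)
true_value = 5.000         # the quantity actually being measured
systematic_offset = 0.050  # hypothetical uncalibrated bias in the instrument

# Precise readings (tiny random scatter) sitting on top of the bias.
readings = [true_value + systematic_offset + random.gauss(0, 0.002)
            for _ in range(10)]

mean = sum(readings) / len(readings)
print(f"mean reading = {mean:.4f}, i.e. ~{mean - true_value:.3f} from the true value")
# Averaging more readings shrinks the scatter but never removes the offset;
# only identifying and calibrating out the systematic error restores accuracy.
```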
John concluded his talk by looking at the idea of a standard and how current thinking is to have the basic standards expressed in terms of quantum constants, e.g. the electronic charge and Planck's constant, rather than a directly measurable quantity such as the metre for length or the kilogram for mass in the SI system. John pointed out that the only SI standard that had not yet been practically transferred to a quantum standard was that of electrical current. The problem here is the smallness of the electronic charge: practical currents involve enormous numbers of electrons, so counting individual ones is totally impractical!
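The scale of the counting problem is simple to check; the arithmetic below is mine rather than a figure quoted in the talk, using the exact value of the elementary charge fixed in the 2019 SI redefinition.

```python
e = 1.602176634e-19  # elementary charge in coulombs (exact in the 2019 SI)
current = 1.0        # one ampere, i.e. one coulomb per second

electrons_per_second = current / e
print(f"{electrons_per_second:.3e} electrons per second")  # ~6.24e18
```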
Richard Aldridge opened his presentation with a confession that, in the process of developing the material for this talk, he had become more confused about his understanding of what time is! Some of this was due to the problem of reconciling the quantum description of matter with that provided by the relativistic view, and some of it came from the way the word time is used in everyday conversation.
The only way in which some of this confusion can be resolved is by looking at how the concept of time has developed over the ages. Adopting this approach, the first part of the talk examined how early man observed the light levels varying with the motion of the Sun by day and of the Moon and stars by night, and the tools that were used to make the measurements. Richard described early prehistoric monuments, e.g. Stonehenge, and their use in predicting the seasons, of particular importance during the transformation of social groups from the hunter-gatherer lifestyle to the farming mode. Our ancestors also developed various other timepieces that depended on material flows, such as water clocks and hourglasses, as well as devices such as constant-burn-rate candles. Using these devices, the basic units of time, such as the day, hour, minute and second, were defined.
During the Renaissance, major developments in glass technology and knowledge from the Islamic world combined to bring much more accurate astronomical measurements. Together with improvements in pendulum clocks, these enabled Kepler to describe the motion of the planets around the Sun. This important development allowed Galileo and Newton to introduce the idea of a universal clock. Because of the way they introduced this idea of time, it had a built-in problem: the reversibility of time. This ran counter to the human experience of observable time running in one direction only, i.e. from past to future. That view was backed up by all the work done on improving the design of machines, particularly steam-driven devices, whose thermodynamics showed entropy always increasing and so singled out a direction for time.
Some of the reversibility problem was removed when it was discovered that information can travel no faster than the speed of light, and that the speed of light has the same value in whatever frame of reference the measurer happens to be. Einstein showed, in his special theory of relativity, that because of these constraints there can be no such concept of universal time as conjectured by Newton.
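As a brief aside that goes slightly beyond the talk as reported, the standard time-dilation relation makes this concrete: a clock moving at speed v relative to an observer is measured to run slow,

```latex
\Delta t' = \frac{\Delta t}{\sqrt{1 - v^2/c^2}},
```

so observers in relative motion cannot share one universal clock; only at everyday speeds, where v is tiny compared with c, does Newton's common time re-emerge as a good approximation.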
As Richard pointed out in the last part of his talk, the quantum mechanical model (QMM) of matter differs from Einstein's general relativity (GR) in the way it handles time. It would appear that in the most recent forms of the QMM time is an emergent quantity, whereas it is a locally defined feature in GR. Only time will tell!
In his introduction, Dr Michael Nix explained that he had chosen the 18th century because this was the period when the Norwich textile business was at its height in Europe and further afield. However, before looking at how cloth orders came about and how they were delivered, Michael described how the cloth was manufactured, from the raw material through to the final bale.
The bulk of Norwich cloth manufacturing was at the top end of the market, with wool as the main raw material; the final product was loosely known as worsted cloth. This style was developed in the Middle Ages at Worstead, a village some 15 miles north of Norwich. The interesting feature of the yarn produced by the worsted process was that it was spun from long-fibre wool from sheep farmed in Lincolnshire and Leicestershire, with a smaller amount coming from southern Ireland. The first process with this wool is to align the fibres by combing. Once aligned, the fibres are spun into tightly bound yarn. Most cloth is coloured by dyeing, which can be done either before or after weaving; the latter tends to be used when large areas of the finished product are of one colour. Most products were multicoloured. Most of the dyes were produced locally, with the ingredients imported from as far afield as the Americas and the East Indies.
The next stage, once the yarn had been fully prepared, was to weave and finish the cloth to the required standard. Michael went into detail about the looms used to achieve the ordered patterning, and how the final cloth was finished through singeing and calendering, where heat and pressure were used to achieve the required strength and sheen.
Michael concluded his talk by looking at the commercial side of the textile business. He illustrated this with a cloth order for waistcoats, particularly popular in Russia at the time. He showed examples of the pattern books that were used, how agents obtained the order, and the other processes involved in supporting production of the ordered cloth: for example, the transport, and the financial structures such as banking and insurance required to ensure that the order was delivered. One element that was obvious throughout the talk was the timescales involved; the Russian order took about a year from start to completion. Another element that Michael stressed throughout was the regulations involved in ensuring that the cloth met the required standards, and the penalties if it did not.
Speaking to a combined live/Zoom audience, Paul Harris, a leading consultant with Abbey Solutions, opened his talk by setting the context of his presentation: the groundwork for 6G communications networks, expected to be released by about 2030. In particular, he said he would concentrate on how and where artificial intelligence (AI) and machine learning (ML) could be used in the setting up of standards for the system.
To illustrate the problems that Paul and his group had been tackling, he told us about the work they had done on improving the characterisation of the radio channel between transmitter and receiver using AI/ML in the context of 5G. As he illustrated with a diagram of the channel, some elements are static, such as reflections off buildings, but others are dynamic, for example refracted rays from rain clouds. He also pointed out that no two base stations are identical. Paul indicated that his group had employed neural networks to improve transmission beam choice, and stressed the importance of employing an appropriate set of training/testing data.
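Purely as an illustrative sketch, since no code was shown and the architecture and sizes here are invented, beam choice of this kind can be framed as a small neural-network classifier that maps a channel measurement to a score for each candidate beam:

```python
import torch
import torch.nn as nn

NUM_FEATURES = 64  # assumed size of a channel measurement vector
NUM_BEAMS = 32     # assumed size of the candidate beam codebook

# A deliberately small multilayer perceptron: channel snapshot in,
# one score per candidate beam out; the argmax is the chosen beam.
model = nn.Sequential(
    nn.Linear(NUM_FEATURES, 128),
    nn.ReLU(),
    nn.Linear(128, NUM_BEAMS),
)

channel_snapshot = torch.randn(1, NUM_FEATURES)  # stand-in measurement
best_beam = model(channel_snapshot).argmax(dim=1)
print(f"chosen beam index: {best_beam.item()}")
```

In practice such a model would be trained, for example with cross-entropy against beams found by exhaustive search, on exactly the kind of representative training/testing data Paul emphasised, since a network fitted to one base station's channel need not transfer to another.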
In the second half of his talk, Paul outlined where AI could be used to advantage in redesigning some of the sub-elements of both the transmitter and receiver blocks in the context of 6G. He described how, in principle, if the appropriate training data were available, it might just be possible to use AI to design a complete transmitter/receiver pairing for a given channel. However, the problem lies in having the correct data and the computing power to deal with the ensuing complexity, and Paul thought that this was perhaps a step too far. His approach is to concentrate on sub-blocks where significant gains can be made. He told us about some work that had been done on units within the receiver block, where it is thought that the Channel Estimation, Equalisation and Symbol Demapping sub-units can be replaced by a single implementable block.
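Again as an invented sketch rather than the design Paul's group actually evaluated, the idea of merging those three sub-units is that one trainable block maps received symbols (plus known pilots) directly to bit probabilities:

```python
import torch
import torch.nn as nn

BLOCK_LEN = 128      # assumed symbols per processing block
BITS_PER_SYMBOL = 4  # e.g. 16-QAM

class LearnedReceiver(nn.Module):
    """One trainable block standing in for channel estimation,
    equalisation and symbol demapping."""
    def __init__(self):
        super().__init__()
        # Input: real and imaginary parts of received symbols and pilots.
        self.net = nn.Sequential(
            nn.Linear(4 * BLOCK_LEN, 512),
            nn.ReLU(),
            nn.Linear(512, BLOCK_LEN * BITS_PER_SYMBOL),
        )

    def forward(self, received, pilots):
        x = torch.cat([received.real, received.imag,
                       pilots.real, pilots.imag], dim=-1)
        return self.net(x)  # logits: one per transmitted bit

rx = torch.randn(1, BLOCK_LEN, dtype=torch.complex64)  # stand-in received block
pl = torch.randn(1, BLOCK_LEN, dtype=torch.complex64)  # stand-in pilot block
bit_logits = LearnedReceiver()(rx, pl)
print(bit_logits.shape)  # torch.Size([1, 512])
```

Training such a block end-to-end against known transmitted bits is precisely what demands the large, representative datasets and the computing power Paul described; the sketch above only shows the shape of the idea.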
The session concluded with a lively Q&A, with topics ranging from the importance of reliable training/testing data to the very fundamental question of why so much effort is being put into 6G when 5G is still very limited in its coverage and uptake.