John V. Tucker

John Vivian Tucker (born 4 February 1952) is a British computer scientist and expert on computability theory, also known as recursion theory. Computability theory is about what can and cannot be computed by people and machines. His work has focused on generalising the classical theory to deal with all forms of discrete/digital and continuous/analogue data, and on applying these generalisations as formal methods for system design, based on abstract data types and on the interface between algorithms and physical equipment.

Biography

Born in Cardiff, Wales, he was educated at Bridgend Boys' Grammar School, where he was taught mathematics, logic and computing. He read mathematics at the University of Warwick (BA in 1973), and studied mathematical logic and the foundations of computing at the University of Bristol (MSc in 1974, PhD in 1977). He held posts at Oslo University, the CWI in Amsterdam, and the Universities of Bristol and Leeds, before returning to Wales as Professor of Computer Science at Swansea University in 1989. In addition to theoretical computer science, Tucker also lectures on the history of computing and on the history of science and technology in Wales.

Tucker founded the British Colloquium for Theoretical Computer Science in 1985 and served as its president from its inception until 1992. He is a Fellow of the British Computer Society and an editor of several international scientific journals and monograph series. At Swansea, he has been Head of Computer Science (1994–2008), Head of Physical Sciences (2007–11) and Deputy Pro Vice Chancellor (2011–2019). He is a Member of Academia Europaea. Outside computer science, Tucker has been a trustee of the Welsh think-tank the Institute of Welsh Affairs and chair of its Swansea Bay branch. He is also a trustee of the South Wales Institute of Engineers Educational Trust and of the Gower Society.

Professor Tucker is married to Dr. T.E. Rihll, formerly a Reader in Ancient History at Swansea University.

In the early 1990s, he began to lobby for a national academy for Wales. In 2008 a process to create such an academy began, sponsored by the then University of Wales. Professor Tucker is a Founding Fellow of the Learned Society of Wales, and in July 2010 he was appointed as its inaugural General Secretary, a post he held until May 2017.

Work on computability and data types

Classical computability theory is based on the data types of strings or natural numbers. In general, data types, both discrete and continuous, are modelled by universal algebras, which are sets of data equipped with operations and tests. Tucker's theoretical work tackles the problems of: how to define or specify properties of the operations and tests of data types; how to program and reason with them; and how to implement them.
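
The algebraic view of a data type can be made concrete in code. The following is a minimal Python sketch, with names of our own choosing rather than anything drawn from Tucker's papers: a data type is presented as a carrier of data together with named operations and Boolean-valued tests, here the natural numbers with zero, successor, addition and an equality test.

  from dataclasses import dataclass
  from typing import Any, Callable, Dict

  @dataclass
  class Algebra:
      # A data type presented as an algebra: a carrier of data together with
      # named operations and Boolean-valued tests on that data (illustrative only).
      carrier: Callable[[Any], bool]             # membership predicate for the data
      operations: Dict[str, Callable[..., Any]]  # operations on the carrier
      tests: Dict[str, Callable[..., bool]]      # tests on the carrier

  # The natural numbers with zero, successor and addition, plus an equality test.
  naturals = Algebra(
      carrier=lambda x: isinstance(x, int) and x >= 0,
      operations={
          "zero": lambda: 0,
          "succ": lambda n: n + 1,
          "add": lambda m, n: m + n,
      },
      tests={"eq": lambda m, n: m == n},
  )

  assert naturals.operations["add"](naturals.operations["succ"](1), 2) == 4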

In a series of theorems and examples, starting in 1979, Jan Bergstra and Tucker established the expressive power of different types of equations and other algebraic formulae on any discrete data type, guided by theorems of the form:

On any discrete data type, functions are definable as the unique solutions of small finite systems of equations if, and only if, they are computable by algorithms.

Their program comprehensively classified specification methods for data types. The results combined techniques of universal algebra and recursion theory, including term rewriting and Matiyasevich's theorem.
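
A standard textbook illustration of such an equational specification, not taken from the cited papers, is the definition of addition and multiplication on the natural numbers built from zero and the successor operation succ:

  add(x, 0) = x
  add(x, succ(y)) = succ(add(x, y))
  mult(x, 0) = 0
  mult(x, succ(y)) = add(mult(x, y), x)

Addition and multiplication are the unique functions satisfying this small finite system of equations and, read from left to right, the equations form a terminating and confluent term rewriting system that computes them.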

For the other problems, he and his co-workers have developed two independent generalisations of classical computability/recursion theory, which are equivalent for many continuous data types.

The first generalisation, created with Jeffrey Zucker, focuses on imperative programming with abstract data types and covers specifications and verification using Hoare logic. For example, they showed that:

All computable functions on the real numbers are the unique solutions to a single finite system of algebraic formulae.

The second generalisation, created with Viggo Stoltenberg-Hansen, focuses on implementing data types using approximations contained in the ordered structures of domain theory.
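
The flavour of this second approach can be suggested by a small Python sketch, under simplifying assumptions of our own rather than the authors' constructions: a real number such as the square root of 2 is implemented by a chain of ever-finer rational intervals, each approximation refining the one before, in the spirit of the interval domains used in domain theory.

  from fractions import Fraction
  from typing import Tuple

  Interval = Tuple[Fraction, Fraction]  # a rational interval [lower, upper]

  def sqrt2(n: int) -> Interval:
      """The n-th interval approximation of the square root of 2, by bisection."""
      lo, hi = Fraction(1), Fraction(2)
      for _ in range(n):
          mid = (lo + hi) / 2
          if mid * mid <= 2:
              lo = mid
          else:
              hi = mid
      return (lo, hi)

  # Successive approximations refine one another: each interval contains the next,
  # mirroring the ordered approximations of domain theory.
  coarse, fine = sqrt2(5), sqrt2(20)
  assert coarse[0] <= fine[0] and fine[1] <= coarse[1]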

The general theories have been applied as formal methods in microprocessor verification, in the design of data types, and in tools for volume graphics and for modelling excitable media, including the heart.

Work on computability and physics

Since 2003, Tucker has worked with Edwin Beggs and Felix Costa on a general theory analysing the interface between algorithms and physical equipment. The theory answers various questions concerning:

  1. how algorithms can be boosted by special purpose physical devices acting as "oracles";
  2. how algorithms control physical experiments that are designed to make measurements.

By transforming the idea of an oracle in computability theory, they combine algorithmic models with precisely specified models of physical processes. For example, they ask the question:

If a physical experiment were to be completely controlled by an algorithm, what effect would the algorithm have on the physical measurements made possible by the experiment?

Their central idea is that, just as Turing modelled the human computer in 1936 by a Turing machine, they model a technician, performing an experimental procedure that governs an experiment, by a Turing machine. They show that the mathematics of computation imposes fundamental limits on what can be measured in classical physics:

There is a simple Newtonian experiment to measure mass, based upon colliding particles, for which there are uncountably many masses m such that, for every experimental procedure governing the equipment, it is only possible to determine finitely many digits of m, even allowing arbitrarily long run times for the procedure. In particular, there are uncountably many masses that cannot be measured.
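
The idea behind such results can be conveyed by a Python sketch that rests on assumptions of our own, in particular an invented cost model, rather than on the authors' precise construction: an algorithm governs a comparison experiment used as an oracle, each query fires a test mass against the unknown mass m and takes longer the closer the test mass is to m, so within any time budget only finitely many digits of m can be extracted.

  def experiment(z: float, m: float, time_budget: float) -> str:
      """Physical oracle: compare a test mass z with the unknown mass m.
      Assumed cost model: the run takes longer the closer z is to m."""
      if z == m:
          return "timeout"               # the comparison never halts when z == m
      duration = 1.0 / abs(z - m)        # assumed physical running time
      if duration > time_budget:
          return "timeout"
      return "less" if z < m else "greater"

  def measure(m: float, queries: int, time_budget: float):
      """Bisection procedure, governed by the algorithm, for m assumed in (0, 1)."""
      lo, hi = 0.0, 1.0
      for _ in range(queries):
          z = (lo + hi) / 2
          answer = experiment(z, m, time_budget)
          if answer == "timeout":
              break                      # no further precision within the budget
          lo, hi = (z, hi) if answer == "less" else (lo, z)
      return lo, hi

  # However many queries are allowed, the time budget caps the attainable precision.
  print(measure(0.337, queries=50, time_budget=1e6))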

Work on digital society

Since 2004, Tucker and Victoria Wang have studied the nature and role of digital data in personal, social and organisational contexts, especially surveillance. First, they have created a theory of phatic technologies and integrated it into the theory of modernity developed by Anthony Giddens. Second, they have developed a theory of monitoring people and objects that is used to analyse many surveillance contexts and processes; this work has led to mathematical models of monitoring systems derived from abstract data type theory.

Work on history of science and technology

In 2007 Tucker founded the History of Computing Collection at Swansea University. He has lectured on the history of computation since 1994, with interests in computing before computers, and theories of data and computation. He is a founding member of the editorial board of the Springer book series History of Computing. He also lectures on the history of science and technology in Wales and is a founding member of the editorial board of the University of Wales Press book series Scientists of Wales.

References

  1. J A Bergstra and J V Tucker, Equational specifications, complete term rewriting systems, and computable and semicomputable algebras, Journal of the ACM, 42 (1995), pp. 1194–1230.
  2. V Stoltenberg-Hansen and J V Tucker, Effective algebras, in S Abramsky, D Gabbay and T Maibaum (eds.), Handbook of Logic in Computer Science, Volume IV: Semantic Modelling, Oxford University Press (1995), pp. 357–526.
  3. V Stoltenberg-Hansen and J V Tucker, Computable rings and fields, in E Griffor (ed.), Handbook of Computability Theory, Elsevier (1999), pp. 363–447.
  4. J V Tucker and J I Zucker, Computable functions and semicomputable sets on many sorted algebras, in S Abramsky, D Gabbay and T Maibaum (eds.), Handbook of Logic in Computer Science, Volume V: Logic and Algebraic Methods, Oxford University Press (2000), pp. 317–523.
  5. J V Tucker and J I Zucker, Abstract computability and algebraic specification, ACM Transactions on Computational Logic, 5 (2004), pp. 611–668.
  6. J A Bergstra and J V Tucker, The rational numbers as an abstract data type, Journal of the ACM, 54 (2) (2007), Article 7. https://dl.acm.org/doi/10.1145/1219092.1219095
  7. J A Bergstra, Y Hirschfeld and J V Tucker, Meadows and the equational specification of division, Theoretical Computer Science, 410 (2009), pp. 1261–1271. doi:10.1016/j.tcs.2008.12.015
  8. E J Beggs, J F Costa, B Loff and J V Tucker, Computational complexity with experiments as oracles, Proceedings of the Royal Society Series A, 464 (2008), pp. 2777–2801.
  9. E J Beggs, J F Costa, B Loff and J V Tucker, Computational complexity with experiments as oracles II: Upper bounds, Proceedings of the Royal Society Series A, 465 (2009), pp. 1453–1465.
  10. E J Beggs, J F Costa and J V Tucker, Limits to measurement in experiments governed by algorithms, Mathematical Structures in Computer Science, 20 (2010), pp. 1019–1050.
  11. Victoria Wang and J V Tucker, Phatic systems in digital society, Technology in Society, 46 (2016), pp. 140–148. http://dx.doi.org/10.1016/j.techsoc.2016.06.002
  12. Victoria Wang, Kevin Haines and J V Tucker, Deviance and Control in Communities with Perfect Surveillance – The Case of Second Life, Surveillance and Society, 9 (2011), pp. 31–46. https://doi.org/10.24908/ss.v9i1/2.4096
  13. Victoria Wang and J V Tucker, ‘I am not a number’: Conceptualising identity in digital surveillance, Technology in Society, 67 (2021), 101772. https://doi.org/10.1016/j.techsoc.2021.101772
  14. J V Tucker, Robert Recorde: data, computation and the Tudor knowledge economy, in G Roberts and F Smith (eds.), Robert Recorde: Life and Work, University of Wales Press (2012), pp. 165–187.
  15. J V Tucker, Richard Price and the History of Science, Transactions of the Honourable Society of the Cymmrodorion, New Series 21 (2017), pp. 69–86.
  16. J V Tucker, The Computer Revolution and Us: Computer Science at Swansea University from the 1960s, Swansea University Centenary Essays. https://collections.swansea.ac.uk/s/swansea-2020/page/computer-science