“People do not understand: the road to a catastrophe is long, and at its beginning, the end is not always clear,” – Nikolai Steinberg
To mark the 40th anniversary of the Chornobyl disaster, Nikolai Steinberg’s book “Our Long Road to April” is set to be published. The author is not merely a witness to the events of 1986. He is an expert who advanced from operator to chief engineer of the Chornobyl Nuclear Power Plant during the most challenging post-accident months and later led the commission investigating the disaster’s circumstances and causes.
In an interview, Nikolai Steinberg speaks candidly about how total secrecy, the management’s fear of admitting mistakes, and the disregard for previous incidents made the catastrophe inevitable. Critical information was withheld from operators, and production demands were always prioritized over safety.
The interview was prepared by SSTC NRS in collaboration with the Editorial Board of Uatom.org as part of a series of events commemorating the 40th anniversary of the Chornobyl disaster.
– On the anniversary of the Chornobyl disaster, your book “Our Long Road to April” will be published. You write candidly that creating it was your personal obligation to your colleagues, as for many of you the world forever split into “before” and “after” April 26, 1986. You also mention that you did not work alone, and that it was your friends and colleagues who insisted on its publication, providing invaluable support. Could you tell us a bit more about the process of working on the manuscript – how did this collaboration unfold, and how emotionally challenging was it to revisit memories of those who are no longer with us?
Let emotions be set aside; they are not the main point. Things are seen more clearly from a distance. Forty years have passed. The secondary has faded, while information recorded in written documents (scientific publications, dissertations, and memoirs) has come to the fore.
An accident is not merely a single moment of incident. It is a prolonged process that begins long before the event itself, especially when it escalates to catastrophic proportions, as in 1986. What could only be speculated about for years now has been documented in writing and can be relied upon.
Materials had been accumulating, but everything changed with the current situation. First, the war. Second, and perhaps even more importantly, the realization that the lessons of 1986 have been forgotten in our country. People do not understand: the road to a catastrophe is long, and at its beginning, the end is not always clear. I realized it was worth reminding them of this; perhaps it will prompt them to pause and reflect.
– It is well known that nuclear energy is an extremely complex technical field. How difficult was it to strike a balance between professional accuracy and making the text accessible?
I decided not to turn the book into a textbook. Moreover, for those who are interested, the internet today offers ample opportunities to explore technical details. What mattered most to me was conveying the essence of the problem – the interaction between humans and the technology they create. Above all, the emphasis is on the human role and the moral dimensions of this issue.

– In the introduction to your book, you mention that even after nearly 40 years, the events of those days still linger, and your memory constantly brings you back to them and your colleagues. To what extent did the desire to finally dispel the myth that the catastrophe was deliberately and unfairly blamed on a single person (the operator) and to explain the true systemic causes of the tragedy influence your decision to write the book?
How can one abandon what truly makes up the entirety of their life? It was recently the 55th anniversary of the day I first set foot on the site of the future Chornobyl Nuclear Power Plant: forest, river, sand. All of this had been created by human hands, through immense effort and resources. And later, with those same hands, it was transformed into a monument to human labor.
The point is not that the system’s lies were pinned on a single individual – that is merely a consequence. What matters most is that even in purely scientific and engineering work, in the management of entire industries and masses of people, moral qualities have always been and remain paramount. And when a system begins to trample on them, catastrophe becomes inevitable.
– The decision to build the RBMK-1000 reactor was made under the urgent need to expand electricity production. It is well known that the designers abandoned the creation of test stands and small-scale prototypes, launching a million-kilowatt reactor directly into serial production. Can this technological voluntarism and the lack of a thorough safety analysis during the design phase be considered the first step toward tragedy?
Yes, this set the stage for all the problems that followed. The process leading to the catastrophe could have been halted, but moral weakness intervened: the fear of reporting “up the chain” that mistakes had been made – and that at the very least operation should have been slowed, the issues investigated, and additional safety measures developed and implemented – prevailed. Most importantly, the hidden pitfalls were concealed from the operational staff, from the operators – the very people who could have ensured safety. The fear of admitting errors outweighed integrity.
– One of the systemic problems that led to the Chornobyl disaster was the absence of nuclear safety legislation and a truly independent review process. Looking back, do you believe that establishing an independent state nuclear regulatory authority at the design stage of the RBMK-1000 would have prevented the reactor from ever being put into operation?
That is indeed the case. Disregard for the rule of law has been shaped over centuries of Russian history. This is not a story that began in 1917 – it continues to this day, regardless of who sits on the throne: a tsar, a general secretary, or a president. A nation’s culture does not change in a matter of years. We lived and worked in a system defined by expediency – and that is hardly the most reliable guiding principle. Ukraine was the first among the former Soviet states to establish nuclear legislation, and the first to create a system for nuclear safety regulation. I have no doubt that today even a young specialist would not allow an RBMK-1000 reactor to be put into operation. But that is today. Psychologically, attitudes toward nuclear safety have changed. We grew up in an environment where the very possibility of a severe accident lay beyond the limits of our understanding. We were raised with the conviction: “this cannot happen, because it can never happen.” The warning signs of a potential accident appeared more than once, yet psychologically we could not bring ourselves to look “behind the curtain.” This is a bitter truth. There is no need for self-reproach, but we must explain how to avoid such mistakes in the future. And this applies not only to nuclear energy.
– In your book, you examine in detail the issue of the operational reactivity margin (ORM). You argue that the operators on April 26, 1986 violated the operating procedures without being aware of it, as the instruments did not provide real-time information. Moreover, the procedures themselves did not indicate that, under conditions of a low ORM, the emergency protection system could effectively turn into a mechanism for accelerating the reactor. Would it be fair to say that this violation amounted to a trap embedded in the reactor’s design?
The ORM turned out to be a trap – a trap for everyone. No one set out to create it deliberately. I rule out the possibility that the specialists working alongside the Chief Designer consciously engineered it. I believe that, like us, they only came to understand it after the accident. Neither the operating personnel nor the specialists who worked with the Chief Designer were able to piece together the various conditions, each of which we encountered during operation. Those closest to recognizing the potential danger of a combination of several operational features – specifically during low-power operation following a power reduction – were the specialists working with the Scientific Supervisor. In a letter sent to the Ministry and to the Chief Designer, they expressed their concerns and proposed ways to avoid potential “problems.” However, they did not sound the alarm; they merely informed. They were not psychologically prepared. Nor was the management of the RBMK nuclear power plants prepared, and these concerns were never communicated to the operators. Instead, the Chief Designer responded that he was already aware of the issue and would take the necessary measures. Three years lay ahead. Nothing was done. Why? Because no one was truly responsible for safety. Yes, everyone bore a share of responsibility – but no one bore it fully. And on April 26, all the “features” of the RBMK-1000 operating modes came together.

– Mr. Steinberg, in the operational history of the RBMK-1000, there was an event that experts today describe as a true rehearsal for Chornobyl – the accident at Unit 1 of the Leningrad Nuclear Power Plant in 1975. However, this information was concealed not only from the public but also from the personnel of other plants. In your opinion, if the designers had not hidden the truth at the time and had openly explained the causes of that accident to the operating staff, might the catastrophe of 1986 have been avoided?
There is no such thing as a 100% guarantee, nor could there ever be. However, with a high degree of probability, the accident could have been prevented under the scenario that unfolded in April 1986. A fundamental solution to the problem would have involved developing and implementing a set of technical and organizational measures capable of significantly reducing the likelihood of such an accident. None of this was done.
– Could you elaborate on which specific technical and organizational measures you mean, and explain them in more detail?
I have spoken about this many times. These are the measures that could have prevented such a course of events and the scenario that unfolded at Unit 4 on April 26. But once again – everything must be done in a timely manner. When information is withheld from the operating staff, they are effectively forced to work “blind.” It’s like driving a car whose operating manual omits key instructions. As long as you drive straight at a steady speed, nothing happens. But sooner or later, the moment comes when you need to stop or turn – and that’s when the “surprises” appear.
Whenever safety is sacrificed for the sake of other potential benefits, it always ends badly.
– It is striking that just ten months before the Chornobyl disaster, in June 1985, a serious incident occurred at the Smolensk Nuclear Power Plant, when the reactor began to accelerate on its own and then shut down – and no investigation was conducted. You have described this as playing “Russian roulette.” Am I correct in understanding that a catastrophe could have occurred at any other RBMK plant before 1986, and Chornobyl was simply the case where they were “unlucky”?
First, this event was not formally classified as an incident. The operational occurrence was recorded nowhere except in the logbook and in the publication on the website I reference (editor’s note: the author refers to a site mentioned in his book “Our Long Road to April”).
Second, this itself confirms the fact that operators had become so accustomed to the quirks of the RBMK-1000 that they no longer reacted to them. The priority was to produce electricity, not to pay attention to “minor” issues that got in the way.
And third – what a high level of operator qualification! This also played its part.
– Looking at all these accidents at the Leningrad, Smolensk, and Chornobyl plants, it becomes clear that the fatal flaws of the RBMK design manifested repeatedly. In your opinion, if it had not been for the culture of secrecy and the departmental monopoly on the truth, could the RBMK-1000 have been modified in a way that would have made the 1986 accident impossible?
This is one of the key moments on our road to April. Today, it is difficult to explain this to those who have not experienced that system firsthand. Information could remain secret within a single plant. In September 1982, an accident occurred at Unit 1 of the Chornobyl Nuclear Power Plant. Channel 62-44 was damaged, resulting in severe radiation consequences. Personnel from various departments were involved in cleaning the rooms and equipment. Those who went to “help out” had to sign a statement promising to tell “nothing to anyone.” However, in the turbine hall, I worked with many guys who were well-versed in reactor operations. That’s why I knew the consequences, but not why or how it happened. Even today, not all experts fully support the official version of that accident. This is just one example. And regarding what happened at other nuclear power plants – especially those under the authority of the USSR Ministry of Medium Machine Building – we knew virtually nothing.
After the accident at the Leningrad Nuclear Power Plant, which I mentioned earlier, some safety measures were implemented that we were unaware of. It is quite possible that they could have at least reduced the magnitude of the “final effect” that initiated the reactor runaway on April 26, 1986.
Read the continuation of the interview next week, covering the investigation into the causes of the Chornobyl disaster, the court proceedings, and Nikolai Steinberg’s perspectives on the present day.
The book “Our Long Road to April” is the testimony of a professional who progressed from reactor operator, unit shift supervisor, and plant department head at Chornobyl Nuclear Power Plant to its chief engineer during the most challenging post-accident months. In the book, the author reconstructs the chain of events that led to the disaster, showing that the explosion was caused not only by the design flaws of the RBMK-1000 and personnel errors, but also by the absence of a legislatively regulated system of authority and accountability for ensuring nuclear safety in the USSR. For the first time, the analysis of the scientific and technical causes of the accident is combined with an assessment of the hypocrisy of the state system, which acknowledged reactor defects behind closed doors while publicly holding the operating staff solely responsible for the disaster.
Editorial Board of Uatom.org