A spaceflight narrative unstuck

“First, a clarification: Unlike in Gravity, the 2013 film about two astronauts left adrift after space debris damages their shuttle, Sunita Williams and Butch Wilmore are not stuck in space.”

This is the first line of an Indian Express editorial today, and frankly, that says it all. The idea that Williams and Wilmore are “stuck” or “stranded” in space just won’t die because reports in the media — from The Guardian to New Scientist, from Mint to Business Today — keep propping it up.

Why are they not “stuck”?

First: because “stuck” implies both that Boeing/NASA are denying the astronauts an opportunity to return and that the astronauts wish to return, neither of which is true. What was to be a shorter visit has become a longer sojourn.

This leads to the second answer: Williams and Wilmore are spaceflight veterans who were picked specifically to deal with unexpected outcomes, like what’s going on right now. If amateurs or space tourists had been picked for the flight and their stay at the ISS had been extended in an unplanned way, then the question of their wanting to return would arise. But even then we’d have to check if they’re okay with their longer stay instead of jumping to conclusions. If we didn’t, we’d be trivialising their intention and willingness to brave their conditions as a form of public service to their country and its needs. We should think about extending the same courtesy to Williams and Wilmore.

And this brings us to the third answer: The history of spaceflight — human or robotic — is the history of people trying to expect the unexpected and to survive the unexpectable. That’s why we have test flights and then we have redundancies. For example, after the Columbia disaster in 2003, part of NASA’s response was a new protocol: that astronauts flying in faulty space capsules could dock at the ISS until the capsule was repaired or a space agency could launch a new capsule to bring them back. So Williams and Wilmore aren’t “stuck” there: they’re practically following protocol.

For its upcoming Gaganyaan mission, ISRO has planned multiple test flights leading up to the human version. It’s possible this flight or subsequent ones could throw up a problem, causing the astronauts within to take shelter at the ISS. Would we accuse ISRO of keeping them “stuck” there or would we laud the astronauts’ commitment to the mission and support ISRO’s efforts to retrieve them safely?

Fourth: “stuck” or “stranded” implies a crisis, an outcome that no party involved in the mission planned for. It creates the impression that human spaceflight (in this particular mission) is riskier than it actually is and produces false signals about the competence of the people who planned the mission. It also erects unreasonable expectations about the sort of outcomes test flights can and can’t have.

In fact, the very fact that the world has the ISS and that NASA (and other agencies capable of human spaceflight) has its protocols means this particular outcome — of the crew capsule malfunctioning during a flight — needn’t be a crisis. Let’s respect that.

Finally: “Stuck” is an innocuous term, you say, something that doesn’t have to mean all that you’re making it out to be. Everyone knows the astronauts are going to return. Let it go.

Spaceflight is an exercise in control — about achieving it to the extent possible without also getting in the way of a mission and in the way of the people executing it. I don’t see why this control has to slip in the language around spaceflight.

A new source of cosmic rays?

The International Space Station carries a suite of instruments conducting scientific experiments and measurements in low-Earth orbit. One of them is the Alpha Magnetic Spectrometer (AMS), which studies antimatter particles in cosmic rays to understand how the universe has evolved since its birth.

Cosmic rays are particles or particle clumps flying through the universe at nearly the speed of light. Since the mid-20th century, scientists have found cosmic-ray particles are emitted during supernovae and in the centres of galaxies that host large black holes. Scientists installed the AMS in May 2011, and by April 2021, it had tracked more than 230 billion cosmic-ray particles.

When scientists from the Massachusetts Institute of Technology (MIT) recently analysed these data — the results of which were published on June 25 — they found something odd. Roughly one in 10,000 of the cosmic-ray particles were neutron-proton pairs, a.k.a. deuterons. The universe contains only a small number of these particles (around 0.002% of all atoms) because they were created only during a roughly 10-minute-long window shortly after the universe was born.

Yet cosmic rays streaming past the AMS seemed to have around 5x greater concentration of deuterons. The implication is that something in the universe — some event or some process — is producing high-energy deuterons, according to the MIT team’s paper.
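To make the arithmetic explicit (a quick gloss on the two figures above, not a calculation from the paper itself): one in 10,000 is 0.01%, which is five times the primordial 0.002%:

$$\frac{1/10{,}000}{0.002\%} = \frac{10^{-4}}{2 \times 10^{-5}} = 5$$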

Before coming to this conclusion, the researchers considered and eliminated some alternative explanations. Chief among them is the mechanism by which scientists have understood deuterons to become cosmic rays. When primary cosmic rays produced by some process in outer space smash into matter, they produce a shower of energetic particles called secondary cosmic rays. Thus far, scientists have considered deuterons to be secondary cosmic rays, produced when helium-4 ions smash into atoms in the interstellar medium (the space between stars).

This event also produces helium-3 ions. So if the deuteron flux in cosmic rays is high, and if we believe more helium-4 ions are smashing into the interstellar medium than expected, the AMS should have detected more helium-3 cosmic rays than expected as well. It didn’t.

To make sure, the researchers also checked the AMS’s instruments and the shared properties of the cosmic-ray particles. Two in particular are time and rigidity. Time deals with how the flux of deuterons changes with respect to the flux of other cosmic-ray particles, especially protons and helium-4 ions. Rigidity measures how resistant a charged particle is to being deflected by magnetic fields: for example, how likely a cosmic-ray particle is to reach Earth rather than be deflected away by the Sun. (Equally rigid particles behave the same way in a magnetic field.) Rigidity is expressed in volts; the higher it is, the less the particle is deflected.
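For reference, the standard definition of magnetic rigidity (a textbook relation, not something spelt out in the paper itself) is the particle’s momentum per unit charge:

$$R = \frac{pc}{Ze}$$

where p is the particle’s momentum, c the speed of light, Z its charge number and e the elementary charge. A proton carrying 1 GeV/c of momentum therefore has a rigidity of one billion volts, and the higher the rigidity, the less a given magnetic field bends the particle’s path.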

The researchers analysed deuterons with rigidity from 1.9 billion to 21 billion V and found that “over the entire rigidity range the deuteron flux exhibits nearly identical time variations with the proton, 3-He, and 4-He fluxes.” At rigidity greater than 4.5 billion V, the fluxes of deuterons and helium-4 ions varied together whereas those of helium-3 and helium-4 didn’t. At rigidity beyond 13 billion V, “the rigidity dependence of the D and p fluxes [was] nearly identical”.

Similarly, they found the change in the deuteron flux was greater than the change in the helium-3 flux, both relative to the helium-4 flux. The statistical significance of this conclusion far exceeded the threshold particle physicists use to check whether an anomaly in the data is real rather than a statistical fluke. Finally, “independent analyses were performed on the same data sample by four independent study groups,” the paper added. “The results of these analyses are consistent with this Letter.”

The MIT team ultimately couldn’t find a credible alternative explanation, leaving their conclusion: deuterons could be primary cosmic rays, and we don’t (yet) know the process that could be producing them.

Suni Williams and Barry Wilmore are not in danger

NASA said earlier this week it will postpone the return of Boeing’s Starliner crew capsule from the International Space Station (ISS) to the ground, thus leaving astronauts Barry Wilmore and Sunita Williams onboard the orbiting platform for (at least) two weeks more.

The glitch is part of Starliner’s first crewed flight test, and clearly it’s not going well. To make matters worse, there seems to be little clarity about the extent to which it’s not going well. There are at least two broad causes. The first is NASA and Boeing themselves. As I set out in The Hindu, Starliner is already severely delayed and has suffered terrible cost overruns since NASA awarded Boeing the contract to build it in 2014. SpaceX has as a result been left to pick up the slack, and while it hasn’t minded, the fact remains that Elon Musk’s company currently monopolises yet another corner of the American launch services market.

Against this backdrop, neither NASA nor Boeing — but NASA especially — has been clear about the reason for Starliner’s extended stay at the ISS. I’m told fluid leaks of the sort Starliner has been experiencing are neither uncommon nor dire, that crewed orbital test flights can present such challenges, and that it’s a matter of time before the astronauts return. However, NASA’s press briefings have featured a different explanation: that Starliner’s stay is being extended on purpose — to test the long-term endurance of its various components and subsystems in orbit ahead of operational flights — echoing something NASA discussed when SpaceX was test-flying its Dragon crew capsule (hat-tip to Jatan Mehta). According to the Des Moines Register, the postponement is to “deconflict” with spacewalks NASA had planned for the astronauts and to give them and their peers already onboard the ISS time to further inspect Starliner’s propulsion module.

This sort of suspiciously ex post facto reasoning has also raised concerns NASA knows something about Starliner but doesn’t plan on revealing what until after the capsule has returned — with the added possibility that it’s shielding Boeing to prevent the US government from cancelling the Starliner contract altogether.

The second broad cause is even more embarrassing: media narratives. On June 24, Economic Times reported NASA had “let down” and “disappointed” Wilmore and Williams when it postponed Starliner’s return. Newsweek said the astronauts were “stranded” on the ISS even as a NASA statement further down the same article said they weren’t. The Spectator Index tweeted Newsweek’s report without linking to it but with the prefix “BREAKING”. There are many other smaller news outlets and YouTube channels with worse headlines and claims, feeding a general sense of disaster.

However, I’m willing to bet a large sum of money that Wilmore and Williams feel neither “disappointed” nor “let down” by Starliner’s woes. In fact, NASA and Boeing picked these astronauts over greenhorns because they’re veterans of human spaceflight, aware of and versed in handling the uncertainties of humankind’s currently most daunting frontier. Recall also the Progress cargo capsule failure in April 2015, which prompted Russia to postpone a resupply mission scheduled for the next month until it could identify and resolve some problems with the launch vehicle. Roscosmos finally flew the mission in July that year. The delay left the astronauts onboard the ISS with dwindling supplies, as well as three crew members short.

The term “strand” may also have a specific meaning: after the Columbia Space Shuttle disaster in 2003, NASA instituted a protocol in which astronauts onboard faulty crew capsules in space could disembark at the ISS, where they’d be “stranded”, and wait for a separate return mission. By all means, then, if Boeing is ultimately unable to salvage Starliner, the ISS could undock it and NASA could commission SpaceX to fly a rescue mission.

I can’t speak for Wilmore and Williams but I remain deeply sceptical that they’re particularly bummed. Yet Business Today drummed up this gem: “’Nightmare’: Sunita Williams can get lost in space if thrusters of NASA’s Boeing Starliner fail to fire post-ISS undocking”. Let’s be clear: the ISS is in low-Earth orbit. Getting “lost in space” from this particular location is impossible. Starliner won’t undock unless everyone is certain its thrusters will fire, but even if they don’t, atmospheric drag will deorbit the capsule soon after (which is also what happened to the Progress capsule in 2015). And even if it is Business Today’s (wet) “nightmare”, it isn’t Williams’s.

There’s little doubt the world is in the throes of a second space race. The first happened as part of the Cold War and its narratives were the narratives of the contest between the US and the USSR, rife with the imperatives of grandstanding. What are the narratives of the second race? Whatever they are, they matter as much as rogue nations contemplating weapons of mass destruction in Earth orbit do, because narratives are also capable of destruction. They shape the public imagination and consciousness of space missions, attitudes towards the collaborations that run them, and ultimately what the publics believe they ought to expect from national space programmes and the political and economic value their missions can confer.

Importantly, narratives can cut both ways. For example, for companies like Boeing the public narrative is linked to their reputation, which is linked to the stock market. When the BBC says NASA having to use a SpaceX Dragon capsule to return Wilmore and Williams to Earth “would be hugely embarrassing for Boeing”, the report stands to make millions of dollars disappear from many bank accounts. Of course this isn’t sufficient reason for the BBC to withhold its reportage: its claim isn’t sensational, and the truth will always be a credible defence against (alleged) defamation. Instead, we should be asking whether Boeing and NASA are responding to such pressures if and when they withhold information. It has happened before.

Similarly, opportunistic media narratives designed to ‘grab eyeballs’, with no thought for how they pollute public debate, only vitiate the discourse, raise unmerited suspicions of conspiracy and catastrophe, and sow distrust of the sober, non-sensational articles whose authors are the ones labouring to present a more faithful picture.

Featured image: Astronauts Sunita Williams and Barry Wilmore onboard the International Space Station in April 2007 and October 2014, respectively. Credit: NASA.

Where is the coolest lab in the universe?

The Large Hadron Collider (LHC) performs an impressive feat every time it accelerates billions of protons to nearly the speed of light – and not in terms of the energy alone. For example, you release more energy when you clap your palms together once than the LHC imparts to any single proton. The impressiveness arises from the fact that the energy of your clap is distributed among billions of atoms whereas the proton’s energy resides entirely in a single particle. It’s impressive because of the energy density.
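Here is a minimal back-of-envelope sketch of that comparison in Python. The 6.5 TeV figure is the LHC’s per-proton beam energy during Run 2; the clap energy (on the order of a joule) and the number of atoms sharing it (on the order of 10^23) are rough assumptions of mine rather than figures from the post, and only the orders of magnitude matter:

```python
# Rough comparison of energy density: one LHC proton vs a hand clap.
# The clap energy and atom count below are order-of-magnitude guesses.

eV = 1.602e-19                  # joules per electronvolt
proton_energy = 6.5e12 * eV     # energy of a single 6.5 TeV proton, ~1e-6 J
clap_energy = 1.0               # J, rough estimate for one clap
atoms_in_clap = 1e23            # rough count of atoms sharing that energy

energy_per_atom = clap_energy / atoms_in_clap
print(f"Energy per LHC proton:     {proton_energy:.1e} J")                 # ~1.0e-06 J
print(f"Energy per atom in a clap: {energy_per_atom:.1e} J")               # ~1.0e-23 J
print(f"Proton-to-atom ratio:      {proton_energy / energy_per_atom:.0e}") # ~1e+17
```

The clap as a whole releases far more energy, but per particle the proton carries some seventeen orders of magnitude more, which is the energy-density point.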

A proton like this has a very high kinetic energy. When lots of particles with such energies come together to form a macroscopic object, that object will have a high temperature. This is the relationship between subatomic particles and the temperature of the object they make up. The outermost layer of a star is so hot because its constituent particles have very high kinetic energies. Blue hypergiant stars like Eta Carinae, thought to be among the hottest stars in the universe, have a surface temperature of around 36,000 K and a surface area some 57,600 times larger than that of the Sun. This is impressive not on the temperature scale alone but also on the energy-density scale: Eta Carinae ‘maintains’ a higher temperature over a much larger area.
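To put that relationship in symbols (the textbook result from kinetic theory, nothing specific to stars): the average translational kinetic energy of a particle in a gas at absolute temperature T is

$$\langle E_k \rangle = \frac{3}{2} k_B T$$

where k_B is Boltzmann’s constant. Particle for particle, hotter objects are more energetic ones.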

Now, the following headline and variations thereof have been doing the rounds of late, and they piqued me because I’m quite reluctant to believe they’re true:

This headline, as you may have guessed by the fonts, is from Nature News. To be sure, I’m not doubting the veracity of any of the claims. Instead, my dispute is with the “coolest lab” claim, and it is on entirely qualitative grounds.

The feat mentioned in the headline involves physicists using lasers to cool a tightly controlled group of atoms to near-absolute-zero, causing quantum mechanical effects to become visible on the macroscopic scale – the feature that Bose-Einstein condensates are celebrated for. Most, if not all, atomic cooling techniques endeavour in different ways to extract as much of an atom’s kinetic energy as possible. The more energy they remove, the cooler the indicated temperature.

The reason the headline piqued me was that it trumpets a place in the universe as the “universe’s coolest lab”. Be that as it may (though it may not technically be so; the physicist Wolfgang Ketterle has achieved lower temperatures before), lowering the temperature of an object to a remarkable sliver of a kelvin above absolute zero is one thing; lowering the temperature over a very large area or volume must be quite another. For example, an extremely cold object inside a tight container the size of a shoebox (I presume) must be missing far less energy than a not-so-extremely-cold volume the size of, say, a star.

This is the source of my reluctance to acknowledge that the International Space Station could be the “coolest lab in the universe”.

While we regularly equate heat with temperature without much consequence to our judgment, the latter can be described by a single number pertaining to a single object whereas the former – heat – is energy flowing from a hotter to a colder region of space (or the other way with the help of a heat pump). In essence, the amount of heat is a function of two differing temperatures. In turn it could matter, when looking for the “coolest” place, that we look not just for low temperatures but for lower temperatures within warmer surroundings. This is because it’s harder to maintain a lower temperature in such settings – for the same reason we use thermos flasks to keep liquids hot: if the liquid is exposed to the ambient atmosphere, heat will flow from the liquid to the air until the two achieve a thermal equilibrium.
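In symbols, this is the intuition behind Newton’s law of cooling, a textbook idealisation rather than anything the BEC experiments invoke explicitly: the rate at which heat flows into (or out of) an object grows with the temperature difference between it and its surroundings,

$$\frac{dQ}{dt} \propto T_{\text{surroundings}} - T_{\text{object}}$$

so the colder something is relative to its environment, the harder its insulation or refrigeration has to work to keep it that way.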

An object is said to be cold if its temperature is lower than that of its surroundings. Vladivostok in Russia is cold relative to most of the world’s other cities, but if Vladivostok were the sole human settlement and no one had ever ventured beyond it, the human idea of cold would have to be recalibrated from, say, 10º C to -20º C. The temperature required to achieve a Bose-Einstein condensate is the temperature at which thermal, non-quantum-mechanical effects are so stilled that they stop drowning out the much weaker quantum-mechanical effects; it is given by a formula but is typically lower than 1 K.
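That formula, for an idealised gas of non-interacting bosons (which is how the condensation temperature is usually estimated), is

$$T_c = \frac{2\pi\hbar^2}{m k_B}\left(\frac{n}{\zeta(3/2)}\right)^{2/3}$$

where m is the mass of each atom, n the gas’s number density, k_B Boltzmann’s constant and ζ(3/2) ≈ 2.612. For the dilute atomic clouds used in these experiments it works out to roughly a microkelvin or less, which is why such extreme cooling is needed.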

The deep nothingness of space itself has a temperature of 2.7 K (-270.45º C); when all the stars in the universe die and there are no more sources of energy, all hot objects – like neutron stars, colliding gas clouds or molten rain over an exoplanet – will eventually have to cool to 2.7 K to achieve equilibrium (notwithstanding other eschatological events).

This brings us, figuratively, to the Boomerang Nebula – in my opinion the real coolest lab in the universe because it maintains a very low temperature across a very large volume, i.e. its coolness density is significantly higher. This is a protoplanetary nebula, which is a phase in the lives of stars within a certain mass range. In this phase, the star sheds some of its mass that expands outwards in the form of a gas cloud, lit by the star’s light. The gas in the Boomerang Nebula, from a dying red giant star changing to a white dwarf at the centre, is expanding outward at a little over 160 km/s (576,000 km/hr), and has been for the last 1,500 years or so. This rapid expansion leaves the nebula with a temperature of 1 K. Astronomers discovered this cold mass in late 1995.

(“When gas expands, the decrease in pressure causes the molecules to slow down. This makes the gas cold”: source.)
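More formally, this is adiabatic cooling: a gas that expands without exchanging heat with its surroundings does work at the expense of its molecules’ random motion. For an ideal gas undergoing a reversible adiabatic expansion (an idealisation of the nebula’s outflow),

$$T V^{\gamma - 1} = \text{constant}$$

where γ is the ratio of the gas’s specific heats, so as the volume V grows the temperature T must fall.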

The experiment to create a Bose-Einstein condensate in space – or for that matter anywhere on Earth – transpired in a well-insulated container that, apart from the atoms to be cooled, was a vacuum. So, to the atoms, the container was their universe, their Vladivostok. They were not at risk of the container’s coldness inviting heat from its surroundings and destroying the condensate. The Boomerang Nebula doesn’t have this luxury: as a nebula, it’s exposed to the vast emptiness, and 2.7 K, of space at all times. So even though the temperature difference between itself and space is only 1.7 K, the nebula also has to constantly contend with the equilibrating ‘pressure’ imposed by space.

Further, according to Raghavendra Sahai (as quoted by NASA), one of the nebula’s cold spots’ discoverers, it’s “even colder than most other expanding nebulae because it is losing its mass about 100-times faster than other similar dying stars and 100-billion-times faster than Earth’s Sun.” This implies there is a great mass of gas, and so atoms, whose temperature is around 1 K.

Altogether, the fact that the nebula has maintained a temperature of 1 K for around 1,500 years (plus a roughly 5,000-year offset, to account for the time the nebula’s light takes to reach us) and across more than 3.14 trillion km makes it a far cooler “coolest” place, lab, whatever.

Solutions looking for problems

There’s been a glut of ‘science projects’ that seem to be divorced from their non-technical aspects even when the latter are equally, if not more, important – or maybe it is just a case of these problems always having been around and this author no longer being able to unsee them.

An example that readily springs to mind is the Bharati intermediary script, developed by a team at IIT Madras to ease digitisation of Indian language texts. There is just one problem: why invent a whole new script when Latin already exists and is widely understood, by humans as well as machines? Perhaps the team would have been spared its efforts if it had consulted with an anthropologist.

Another example, also from IIT Madras: the institute just issued a press release announcing that one of its teams, the sole Asian finalist in a competition to build a ‘pod’ for Elon Musk’s Hyperloop transportation concept, has unveiled its design. The problem is that Hyperloop is a high-tech, high-cost solution to a problem that trains and buses were designed to address decades ago, and which they continue to address more efficiently and more feasibly. Elon Musk has admitted he conceived Hyperloop because he doesn’t like mass transit; perhaps more tellingly, his simultaneous bashing of high-speed rail hasn’t gone unnoticed.

Here is a third example, this one worth many crores: the Indian Space Research Organisation (ISRO) wants to build a space station and staff it with its astronauts. The problem is nobody is sure what the need is, maybe not even ISRO, although it has been characteristically tight-lipped. There certainly doesn’t seem to be a rationale beyond “we want to see if we can do it”. If indeed Indian scientists want to conduct microgravity experiments of their own, like what are being undertaken on the International Space Station (ISS) today and will be on the Chinese Space Station (CSS) in the near future, that is okay. But where are the details and where is the justification for not simply investing in the ISS or the CSS?

It is very difficult to negotiate a fog without feeling like something is wrong. We built and launched AstroSat because Indian astronomers needed a space telescope they could access for their studies. We will be launching Aditya in 2020 because Indian astrophysicists have questions about the Sun they would like answered. But even then, let us remember that a (relatively) small space telescope is a far more lightweight exercise than a full-fledged space station, which could cost ISRO more money than it is currently allocated every year.

ISRO chairman K. Sivan’s announcements are also of a piece with those of his predecessors. In fact, the organisation as such has announced many science missions without finalising the instruments they are going to carry. In early 2017, it publicised an ‘announcement of opportunity’ for a mission to Venus next decade and invited scientists to submit pitches for instruments – instead of doing it the other way around. While this is entirely understandable for a space programme that is limited in its choice of launchers, the pattern has also prompted doubts that ISRO is simply inventing reasons to fly certain missions.

Additionally, since Sivan has pitched the Indian Space Station as an “extension” of ISRO’s human spaceflight programme, we must not forget that the human spaceflight programme itself lacks vision. As Arup Dasgupta, former deputy director of the ISRO Space Applications Centre, wrote for The Wire in March this year:

… while ISRO has been making and flying science satellites, … our excursions to the Moon, then Mars and now Gaganyaan appear to break from ISRO’s 1969 vision. This is certainly not a problem because, in the last half century, there have been significant advances in space applications for development, and ISRO needs new goals. However, these goals have to be unique and should put ISRO in a lead position – the way its use of space applications for development did. Given the frugal approach that ISRO follows, Chandrayaan I and the Mars Orbiter Mission did put ISRO ahead of its peers on the technology front, but what of their contribution to science? Most space scientists are cagey, and go off the record, when asked about what we learnt that we can now share with others and claim pride of place in planetary exploration.

So is ISRO fond of these ideas only because it seems to want to show the world that it can, without any thought for what the country can accrue beyond the awe of others? And when populism rules the parliamentary roost – whether under the Bharatiya Janata Party or the Indian National Congress – ISRO isn’t likely to face pushback from the government either.

Ultimately, when you spend something like Rs 10,000-20,000 crore over two decades to make something happen, it is going to be very easy to feel like something was achieved at the end of that time, if only because it is psychologically painful to have to admit that we could get such a big punt wrong. In effect, preparing for ex post facto rationalisation before the fact itself should ring alarm bells.

Supporters of the idea will tell you today that it will help industry grow, that it will expose Indian students to grand technologies, that it will employ many thousands of people. They will need to be reminded that while these are the responsibilities of a national government, they are not why the space programme exists. And that even if the space programme provided all these opportunities, it will have failed without justifying why doing all this required going to space.

SpaceX nears big test to return human spaceflight to America

Since the end of the space shuttle era, no American spacecraft has ferried American astronauts to the International Space Station. While NASA has had no problem with letting Russia step in and transport the astronauts, escalating tensions with Moscow over its de facto annexation of Ukraine’s Crimea have left politicians bristling at the idea of having to depend on the Russians. The issue has become symbolic of an American comeback that is pending but not quite here.

A big step toward rectifying it comes on May 6, Wednesday, when SpaceX will conduct the important pad-abort test (PAT) for its Dragon crew-capsule, unveiled in May 2014. The test is one of the final steps before the capsule is certified by NASA, which awarded a multibillion-dollar contract to SpaceX in 2014 to ferry astronauts to and from the ISS. It adds to the $1.6-billion commercial resupply services deal to transport cargo to, again, the ISS.

The PAT on May 6 will check whether Dragon will be able to keep its crew safe if some misfortune were to befall the launchpad or the launch. The capsule has been fitted with seven seats (one for each astronaut it can house). One of them will be occupied by a sensor-rigged dummy nicknamed “Buster”. During the test, eight SuperDraco engines* integrated with Dragon will fire for six seconds and take the capsule to a height of about 5,000 feet. Then, Dragon will descend under two reefed drogue parachutes and three canopies into a patch of water about 1.5 km from the launchpad. Finally, after recovery, it will be transported to SpaceX’s facility in McGregor, Texas, for analysis.

The entire exercise is expected to take less than two minutes, with most of the action occurring in the first 30 seconds, although it will happen whenever SpaceX feels “ready” within a launch window from 7 am to 2.30 pm (EST) on May 6. The occasion will mark the first time eight SuperDracos will be fired in unison. Each of these thrusters is fueled by monomethyl hydrazine and nitrogen tetroxide. Together, they will generate about 54,430 kg of thrust – a figure SpaceX’s Hans Koenigsmann had smugly called “a lot of kick” during a briefing on May 1. The total weight of the stack (including the propellant) will be 11,115 kg.
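As a quick sanity check on those two figures, here is a back-of-envelope calculation in Python using only the numbers quoted above, treating the 54,430 kg as kilograms-force; the real acceleration will of course vary as propellant burns off and aerodynamic drag builds up:

```python
# Back-of-envelope thrust-to-weight check for the pad-abort stack, using the
# figures quoted above: ~54,430 kgf of combined SuperDraco thrust and an
# 11,115 kg stack (capsule, trunk and propellant).

g = 9.81                     # m/s^2, standard gravity
thrust_kgf = 54430           # combined thrust of eight SuperDracos, kilograms-force
mass_kg = 11115              # total stack mass, kg

thrust_n = thrust_kgf * g                               # ~534 kN
thrust_to_weight = thrust_n / (mass_kg * g)             # ~4.9
net_accel_g = (thrust_n - mass_kg * g) / (mass_kg * g)  # ~3.9 g, net upward

print(f"Thrust-to-weight ratio:  {thrust_to_weight:.1f}")
print(f"Net upward acceleration: {net_accel_g:.1f} g")
```

In other words, at ignition the SuperDracos can accelerate the capsule away from the pad at roughly four times the acceleration due to gravity.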

On April 21, NASA announced on its site,

SpaceX will perform the test under its Commercial Crew Integrated Capability (CCiCap) agreement with NASA, but can use the data gathered during the development flight as it continues on the path to certification. Under a separate Commercial Crew Transportation Capability (CCtCap) contract, NASA’s Commercial Crew Program will certify SpaceX’s Crew Dragon, Falcon 9 rocket, ground and mission operations systems to fly crews to and from the International Space Station.

The PAT had first been scheduled for early April but was postponed after some faults were found in the Falcon 9 rocket’s helium-pressurisation bottles during testing. Once those were rectified, the higher-priority launch of the TurkmenistanAlem 52E/MonacoSat satellite (Turkmenistan’s inaugural telecom satellite) had to be carried out first, which finally happened on April 27. The Falcon 9 itself, however, will not be involved in the PAT.

Dragon is scheduled to undergo its first uncrewed orbital flight test in 2016, followed by a crewed test in 2017. That’s the same timeframe in which Boeing – which also received a contract in 2014 – is expected to finish certifying its own crew capsule, the CST-100.

To stay on track, SpaceX has demanded $1.2 billion a year from NASA. Unsurprisingly, the number was met with skepticism by Congress, which particularly questioned the need for two new crew vehicles in addition to Soyuz rather than just one. Part of that sentiment might’ve been allayed by two recent failures: in October 2014, an Antares rocket exploded moments after takeoff, and earlier this week a Progress 59 spacecraft launched by Russia tumbled out of control in space and fell back to Earth. Both failures deprived the ISS crew of essential supplies.

NASA, on the other hand, doesn’t mind the money. By late-2017 or 2018, “There’s going to be a bit of a race … about who’s going to be flying the first NASA crew member from the Florida Space Coast,” Kathy Lueders, the head of NASA’s Commercial Crew Program, told Florida Today. “This is going to be exciting.”

*… all 3D-printed!

Featured image: The interior of SpaceX’s Dragon crew-capsule. The seating configuration of the seven astronauts it can carry at a time is shown. Credit: SpaceX