Genesis is a project, founded by Claudius Gros, to seed bacterial life elsewhere in the universe. The aim is to build spacecraft that would deliberately deliver bacteria to nearby exoplanets, kick-starting life on geological timescales and giving those planets a head start on eventually evolving something like intelligence. This would seem not only to multiply the number of beings capable of having lives worth living, but to dramatically increase the chances that some intelligence (a species with a life most worth living) escapes its home system, avoiding black swan apocalypses, singularities, etc., and spreads indefinitely, filling the universe with happiness.
A Parfitian could well respond that it's anything but certain that such a project would produce a universe-filling life form capable of happiness, and she would be right. But choosing an action with limited information is the problem with all action selection. On the spectrum of uncertainty: seeding the universe with life > friendly singularity > avoiding biological disaster > avoiding nuclear war > electing the right people in your country > cleaning your room. The problem is that the magnitude of the consequence for total happiness follows the same ordering.
So what's the argument for pursuing a friendly singularity over seeding exoplanets? Or for singularity issues over cleaning your room?
I submit that we don't really know. That's not to say that the singularity or a bio-apocalypse can't happen. Unfortunately, I worry that as we become more powerful without getting better at estimating uncertainty, something like this will eventually be fatal for us, and Drake's omega variable will become a little clearer.
See also GENU, the Gambit of Extreme Negative Utility.