An RPG Project With My Son

My son has been pestering me for a while to gamemaster a game of GURPS with him. I’ve decided to create an all-new science-fictional world with him to play in. Recently, we were playing about and came up with a planet orbiting a moderate-sized star (F, G or K class main-sequence) which was itself a distant companion in a multiple system with a neutron star. My son has been playing a lot of Starbound, and we’ve been watching a lot of videos of Subnautica, No Man’s Sky, Elite Dangerous and Space Engine explorations. These things could, and probably will, flavor what we’re making together…

Over the next few weeks, I’ll be posting the kind of stream-of-consciousness write-ups we’ve created to better understand what we both want out of this world. Then we’ll need to get down to the nitty-gritty game-mechanics stuff. After getting some playing time under our belt, and doing a bit of editing, I may post some game logs or game log-based fiction.

I’m really looking forward to doing some creative work with my son. My daughter isn’t terribly interested in the RPG element, but she does want to help out some with the writing. We’ll see how that goes. Once we get started, she might decide the game looks like fun!

I’ll start posting tomorrow with my initial write-up on the planet. This is all work in progress, so we’ll see how much survives to playtime and beyond.

Thank you for your interest,
The Astrographer

Posted in Science Fiction, World Building, Writing | Leave a comment

Astrographer’s Notebook – The Crystalglass Forest

This was a fairly recent note; not everything in my notebook dates back to the twentieth century🤣. It was posted on December 27th, 2017.

— The Crystalglass Forest —

On a planet whose interior has cooled too much to maintain plate tectonics, the lifeforms have evolved a number of adaptations to the increasing scarcity of atmospheric carbon.

One adaptation found particularly among primary producers in high-latitude habitats is the crystalglass forest.

During the long, cold winters, the biological expense of maintaining living foliage in the absence of light is untenable. It is also unreasonable to expend resources during the intense but short growing season creating entirely new foliage. Add to that the fact that, even in the most extreme environments, decomposer microbes will feast on any source of carbon that isn’t strongly protected, and you have problems.

The crystalglass “trees” have dealt with all of these problems by forming a thick “bark” and a foliage consisting of multitudes of thin, sharp “needles” composed of tough, transparent, crystalline borosilicate fibers.

During the growing season, pores within these structures circulate an aqueous solution of chlorophyll-analogue-bearing cells and other cells intended to break down and rebuild the borosilicate structure. Where there are breaks in the integument, those construction cells will leak out and begin to build new foliage. Gradually, as new needles are built, the structure of old needles is melded together into new bark protecting the living woody inner tissues of the plant. As the living inner parts of the plant grow, the innermost layers of the borosilicate integument are broken down to make space for growth, as well as to free up boron to build more foliage.

As the long, cold darkness of winter settles in, the trees hunker down, withdrawing water and carbon-rich cells into the protected inner parts. The largely opaque-to-translucent green foliage and skin of the plant begin to bleach into transparent crystals.

Younger and smaller plants will withdraw their living tissues entirely beneath the warming embrace of the ground. In older, larger trees, living tissues can be seen as dark masses of greenish- to reddish-brown opacity deep in the trunk and heaviest branches.

The appearance of crystalglass plants generally follows a fairly standard form. The smallest plants, regardless of their longer-term fate, will consist of a living taproot with a spray of crystalline needles right at ground level. Larger plants will form a trunk and perhaps branches shooting skyward. Cracks in the crystal bark will usually spray forth needles of foliage, but some of the larger species will avoid foliating the shaded lower parts of the plant in favor of developing thicker, more resistant bark as living tissues emerge from the ground.

Although originating in the higher latitudes, the relative lack of sensitivity to most predation and fire has led the plants to adaptive radiation into warmer biomes. Some of these plants have abandoned borosilicate foliage in favor of faster-growing living foliage, but retain the thick armored bark.

This was, to some degree, inspired by reading about Epona quite some time ago. Even on first reading (okay, probably second or third reading, but whatever…), it struck me that there would probably be some organisms that found a use for silicon, which was never particularly scarce. Probably not in any sort of energy-producing metabolic process, but perhaps as a structural material. An early development of that was a sort of silicate coral in submarine environments. I could see my imaginary planet having such things as an independent evolutionary line from the crystalglass forest organisms. It wouldn’t be a total no-brainer; the use of silicates would probably be a compromise between the benefits of a large, strong structure using less scarce materials and the energy cost to produce such things. If the energy costs were too unfavorable, it would still be conceivable that plants and perhaps even some animals might evolve to use physically pulverized rock crystals glued together with an organic cement as a skeleton or shell. The idea seems both plausible enough and interestingly alien enough to be worth examining in greater depth.

What are your thoughts? Please feel free to share your questions and comments below.

Thank you,
The Astrographer

Posted in Aliens, Planetary Stuff, Science Fiction, World Building | Leave a comment

Using GIS Tools and Data 2

An overview map of the region in its final state (click to enlarge).

Starting with what we had done by the end of the last post, I would like to continue on to doing some actual analyses using QGIS, GRASS, and my other GIS-type tools. This post has been a long time coming, partly, as always, due to my laziness, but also because I was having a difficult time getting a lot of my software up and running on my “new” computer. Rather than trying to do everything from inside of QGIS, I decided to work separately in QGIS and GRASS (mostly…).

I’m not confident in the realism of the elevations file I composed in the last post; the elevations are quite low. Univariate analysis shows that the range of values is -3 m (I’m not going to concern myself with what’s under the oceans except to avoid problems with basin fill, flow algorithms and the like) to 971 m (which is genuinely very low). I’m not sure whether it needs less emphasis on the 1.0-exponent values or not (as it is, the low areas appear relatively rugged, but until I have higher altitudes (5-9 km), it’s difficult to say). At the very least, I probably need to add a large, high-exponent pass (perhaps 5,000-8,000 m at exponent 5.0). Anyway, I’m using it as is, for now. We’ll see where this gets us.

Okay, SAGA’s hydro-modeling tools look very appealing, but I can’t set up Boot Camp with Windows XP, I’m in no hurry to buy a modern version of Windows, SAGA works poorly on WINE (on my Mac – I’ve heard tell of it working well, but not around here…), and the QGIS-internal versions of SAGA, GRASS, etc. have not been working at all well for me. Grrr… Looks like I’m back to the drawing board.

A closer look at the central region of the continent to get a better look at the streams there(click to enlarge).

The most important thing I want is flow mapping that takes differential rainfall into account. Next, I want D-infinity flow mapping. After messing around with a lot of GRASS’s raster hydrology modules, and producing a lot of unsatisfactory results, I settled on the r.watershed module. This requires an input elevation raster. I have that. It also uses a “flow” raster, which is described in the manual thus: “flow: Input map: amount of overland flow per cell. This map indicates the amount of overland flow units that each cell will contribute to the watershed basin model. Overland flow units represent the amount of overland flow each cell contributes to surface flow. If omitted, a value of one (1) is assumed.” Yeah. This sounds good. Unfortunately, it doesn’t do Tarboton’s D-infinity flow mapping, but it does provide both multiple-flow-direction and deterministic 8-directional flow models, so I’m hopeful I can at least create better incised flow erosion than Wilbur.

The available water map (rainfall minus evaporation; my model doesn’t even try to deal with variable runoff infiltration) was generated using a method derived from this. Roper’s sech(lat) model looked good, but first I had to convert from watts per square meter of energy absorption/release due to evaporation/precipitation to meters of rainfall/evaporation per year. The input flow map only really needs relative contributions of each cell to overland flow, but having proper units seems like it might be useful down the line.

Once I figured this out (correctly, I hope), I entered the following formulae into r.mapcalc:
evaporation = 2.307 / (1 + ((y() - 15)/17)^2) + 2.307 / (1 + ((y() + 15)/17)^2)
This would generate the “evaporation” raster map.
precipitation = 1.957 / (1 + ((y() - 33)/15)^2) + 6.292 / (1 + (y()/4)^2) + 1.957 / (1 + ((y() + 33)/15)^2)
This would generate the “precipitation” map. As my maps were already in latlong format and correctly located (more or less; more about that if I ever get SAGA working…), the y() internal variable represents latitude.
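These curves can be sanity-checked outside GRASS. Here is a rough Python sketch of the same formulas as plain functions of latitude in degrees (using the net_moisture name defined later in the post):

```python
def evaporation(lat):
    """Annual evaporation (m/yr): two Cauchy-style bumps centered on +/-15 degrees."""
    return (2.307 / (1 + ((lat - 15) / 17) ** 2)
            + 2.307 / (1 + ((lat + 15) / 17) ** 2))

def precipitation(lat):
    """Annual precipitation (m/yr): subtropical bumps at +/-33 degrees
    plus a sharp equatorial spike."""
    return (1.957 / (1 + ((lat - 33) / 15) ** 2)
            + 6.292 / (1 + (lat / 4) ** 2)
            + 1.957 / (1 + ((lat + 33) / 15) ** 2))

def net_moisture(lat):
    # Positive means surplus water; negative means desert under this model.
    return precipitation(lat) - evaporation(lat)

# The equator comes out wet, while the mid-twenties latitudes go negative,
# which is where the desert band (net_moisture < 0) appears.
equator_surplus = net_moisture(0)
subtropics_deficit = net_moisture(25)
```

This makes it easy to plot or tweak the bump positions and widths before committing anything to r.mapcalc.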

A closer view of the northern regions to better view the streams there(click to enlarge).

Looking at the results, rainfall seemed awfully heavy under the subtropical high-pressure zone, but I decided to go with it for now. At least the net_moisture (generated in r.mapcalc with the formula net_moisture = precipitation – evaporation) seemed properly dry under the STHZ, although it actually seemed a little too dry generally, and the width of the desert band (r.mapcalc: desert = net_moisture < 0.0) seemed… excessive. I’m following Carl Davidson’s climate-modeling efforts with great interest, but I’ll play with this for now. Later biome development might demand knowledge of precipitation and temperature (and by extension, evaporation) for summer and winter, rather than just annual averages and totals, but this will have to do for now. I’m largely just trying to see how this will work.

In an attempt to further refine the effect a bit, I created a further version of the map with slightly higher precipitation at higher altitudes ( modified_precipitation = precipitation * (1 + 0.001*elevation*sin(2*y())) ). Then I added a bit of precipitation to a blurred version of the sea areas ( coastal_precipitation = modified_precipitation + 0.05*sea_raster_blurred ). The exact details of developing that sea raster are embarrassing, and given the limited quality of the results, I’m not going to share them. Suffice to say, you can generate an initial sea raster with r.mapcalc ( sea_raster = elevation <= 0.0 ), you can dilate the result with r.neighbors and a method of ‘maximum’, and blur the result of that with r.resamp.filter with filter “gauss,hermite”. In retrospect, I probably should have blurred the elevation raster as well. Ehhh…
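The dilate-and-blur trick is not GRASS-specific. Here is a rough NumPy/SciPy equivalent of the same three steps (the toy grid and the sigma value are placeholders of mine, not the actual values used):

```python
import numpy as np
from scipy.ndimage import binary_dilation, gaussian_filter

# Toy elevation grid: cells at or below zero are sea.
elevation = np.array([[-3., -3., 10.],
                      [-3.,  5., 20.],
                      [ 2.,  8., 30.]])

# r.mapcalc: sea_raster = elevation <= 0.0
sea_raster = (elevation <= 0.0).astype(float)

# r.neighbors with method=maximum is, for a 0/1 mask, a binary dilation
sea_dilated = binary_dilation(sea_raster > 0).astype(float)

# r.resamp.filter with a gaussian kernel is roughly a gaussian blur
sea_raster_blurred = gaussian_filter(sea_dilated, sigma=1.0)

# coastal_precipitation = modified_precipitation + 0.05 * sea_raster_blurred
modified_precipitation = np.full_like(elevation, 2.0)
coastal_precipitation = modified_precipitation + 0.05 * sea_raster_blurred
```

Since the blurred mask stays in [0, 1], the coastal bonus tapers smoothly from 0.05 m/yr at the shore toward zero inland.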

A closer view of the southernmost parts of the continent showing some of the extensive deserts and the wet tropical southern peninsula (click to enlarge).

Thinking about that coastal_net_moisture map, I don’t think I want it propagating negative flow contributions through and out of desert areas. Logically, it would be nice to have the negative values taken out of flow passing through a cell (evaporation), but having negative flow accumulations propagate into neighboring cells would not be reasonable. So I think I need to create a version of coastal_net_moisture floored at zero. In r.mapcalc, use the formula r.mapcalc expression=”coastal_net_moisture_floor = ( coastal_net_moisture@PERMANENT >= 0.0 ) ? coastal_net_moisture@PERMANENT : 0.0″. That won’t take into account extreme evaporation of the stream itself, but it should still be more valid than dealing with negative flows.

In addition, I really don’t want to calculate stream flows under the ocean. The submarine areas are not accurately represented anyway, as I did not have bathymetric data available, and for convenience I simply set the ocean areas to -3 meters on the elevation map. So, here, I will create a version of the basin-filled elevation map with all submarine areas set to null. There is definitely a module to do this, but I forget what it is. Anyway, the Map Calculator is sooo versatile, and it’s good to know how to get around in there. I should research more standard workflows, but for now I will use the formula r.mapcalc expression=”elevation_set1_basinfill_land = elevation_set1@PERMANENT >= 0.0 ? elevation_set1_basinfill@PERMANENT : null()”.
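Both of those mapcalc steps are just conditional raster expressions, so they translate directly into NumPy (the small arrays here are mine, for illustration):

```python
import numpy as np

elevation = np.array([[-3., -3., 120.],
                      [-3., 40., 300.]])
coastal_net_moisture = np.array([[0.5, -0.2, 1.1],
                                 [0.3, -0.4, 0.8]])

# coastal_net_moisture_floor = (x >= 0) ? x : 0  -- clamp negatives to zero
coastal_net_moisture_floor = np.maximum(coastal_net_moisture, 0.0)

# elevation_land = elevation >= 0 ? elevation : null()  -- mask out the sea
# (NaN plays the role of GRASS's null here)
elevation_land = np.where(elevation >= 0.0, elevation, np.nan)
```

The clamp keeps desert cells from subtracting flow out of their neighbors, and the NaN mask keeps the flow model from routing streams across the sea floor.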

I already did all of this before I started writing this. Call it a flashback. On to r.watershed!

I set the input elevation raster map to elevation_set1_basinfill_land, and the input flow to coastal_net_moisture_floor. Minimum size of exterior watershed basin I will leave at default (though I may need to rejigger it if the results aren’t satisfactory; the manual says it’s a sensitive parameter😟); the rest of the inputs I’ll leave at default or off.

Next we set the outputs. The accumulation output I will name accumulation_MFD, to differentiate it from the accumulation_D8 I might create in the future. I will also create topographical_index_MFD and stream_power_index_MFD, and all the rest I will name the same as the parameter name with an _MFD suffix appended. The stream output parameter I will name stream_segments_MFD for clarity. I will use the ‘b’ option to beautify flat areas, and I will leave the convergence at its default 5 for now.

I’m pretty satisfied with the results. I have a few good streams heading through the desert. I have quite a few decently long, but not too crowded, streams running through the moister northern regions. And the wet but rugged deep southern peninsula has several short streams. The cumulative flow in the desert regions goes negative even where stream segments are found, which is more than a little odd. My best guess is that the MFD is routing some flow away from the streams and the stream segment generator is trying to force downhill flow to go all the way to the ‘sea’. I don’t see any nasty straight segments, which is awesome. The desert regions, though they seem overly large to my eye (probably due to the weaknesses inherent in my very lame attempt at a precipitation and evaporation model), are pleasantly devoid of streams. Because this was generated from a real-world DEM of a fairly moist area, the deserts are pretty extensively incised with stream-eroded features. This is a downside of using re-purposed real-world elevation data. It might work well to simplify the elevations with gaussian blur and apply erosion to the result. One good tool, which I really don’t have, would be a good aeolian erosion filter to convert some of the flatter desert regions into Saharan fields of marching barchans.

Some of the good behaviors, as well, probably, as some of the bad behaviors, could easily be laid at the feet of the MFD model rather than a single deterministic flow direction. For better or worse, some can be blamed on the use of real-world data, even heavily massaged. A better test might be to use this on generated noise-based elevations.

Many of the layers created in this process were left out of the images shown here. Also, beyond the narrow band of yellow representing the desert regions, there is no climatic data here. Only a shaded relief based on the un-basin-filled elevation map, a nicely massaged desert map in yellows at 25% opacity, a basin-filled land elevation-colored map at 43% opacity, a landmass vector map at 50% opacity, and the generated stream segments map at full opacity, with a monochrome blue color table, are shown overlaid.

Future directions in research would be a much (much, MUCH, MUCH) better climate model, use of less natural initial elevations, use of accumulated flow or stream power index raised to a fractional power and possibly blurred to erode those elevations into a better approximation of naturalistic shapes, and, you know, some actual biome data generated from climate and elevations. With this, I might be able to create something akin to the ‘satellite view’ of tectonics.js, only hopefully a bit more aesthetic.

I hope you found this to be a good introduction to using some of the available free GIS tools, and that you will download GRASS and QGIS and other open source GIS tools and use them to help your worldbuilding efforts. Thank you for reading,
The Astrographer

A full size view of the map found at the top of the page.

Posted in Mapping, Planetary Stuff, World Building | 3 Comments

Measuring Up

So this is a bit of one-off worldbuilding. In the spirit of NaNoWriMo, I’m just going to throw this together superfast and let the chips fall wherever they land.

So, I was reading this on the Zompist bulletin board when I was struck by an idea for one way to develop a set of measurement standards for an imaginary world.

I started by grabbing “StarGen,” a variation on the old Accrete program, from the website. It may not have all of the variety of a more modern planet generation program, but the results should be plausible, or at least not altogether risible. I had it generate 3,000 systems, only returning the ones that contained at least one “habitable” planet. I then examined the systems generated. I settled on this one, a system whose fourth planet was terrestrial and just different enough from Earth to seem interesting to me. There are better ways, programmatic and otherwise, to generate interesting planets, but this way does have the virtue of being fast.

The star is pretty similar to the Sun, with 0.92 of its mass, 0.67 of its luminosity and an age of 5.21 billion years (leaving 8.472 billion years remaining on the main sequence). Let’s call it Holman.

The fourth planet in this system (cleverly named Holman IV by the United Planets Astronomical Survey Service – UPASS) has a mass 0.603 that of Earth, a surface gravity of 8.224 m/s² and an equatorial radius of 5,406 km. Its “average” surface temperature is a balmy 11.4ºC under about 400 millibars of atmosphere and with a hydrospheric coverage of 64.7% (probably more significant figures than will be reflected by the mapping process; figure in the range of 60% ≤ hydrosphere < 70%). The most important parts, which I came here for, are its day of 23.79 hours (85,644 s, disappointingly similar to Earth’s own day length of 86,400 s) and its year length of 270.85 Earth days (23,401,440 s, or 273.24 local days).
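The unit conversions are easy to check, taking the StarGen figures and 86,400 seconds per Earth day:

```python
local_day_hours = 23.79
local_day_s = local_day_hours * 3600    # 85,644 s per local day
year_earth_days = 270.85
year_s = year_earth_days * 86_400       # 23,401,440 s per local year
year_local_days = year_s / local_day_s  # ~273.24 local days per year

print(round(local_day_s), round(year_s), round(year_local_days, 2))
```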

Traditionally, the local sophonts, let’s call them “Gwaps,” use a base-12 numbering system. They divide their day into 12 equal segments (6 of daylight and 6 of night at the equinox), each 7,137 s (118.95 min) long. These could be considered equivalent to hours. In at least one of the local cultures a bell is rung in the social/religious/cultural center to mark these demarcations, thus their word for this timespan translates as “bells.” Of course, the day is also divided into other, more ad hoc demarcations: daylight and night, of course, and thirds and sixths, but these are less significant. More significant is the 144th part of the bell, referred to in translation as a “grosseth bell,” or simply “grosseth.” The grosseth is 49.5625 s long (in theory; in practice, given the Gwaps’ roughly Renaissance level of technological development, “about 50 seconds” is generally more precise than the actual measurement). A dozen grosseth, referred to simply as a dozenth, since it is by definition also a 1/12th part of a bell (in practice about 10 minutes), is a frequently used, though somewhat casual, measure of time. The smallest unit of time in any regular use by the Gwaps is the 1728th part of a bell, or about 4 seconds.
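All of the base-12 subdivisions fall out of the day length by repeated division; a quick sketch:

```python
local_day_s = 23.79 * 3600   # 85,644 s, from the StarGen figures

bell = local_day_s / 12      # 7,137 s (~119 min), marked by the bell-ringing
dozenth = bell / 12          # 594.75 s, roughly 10 minutes
grosseth = bell / 144        # 49.5625 s, "about 50 seconds" in practice
smallest = bell / 1728       # ~4.13 s, the smallest unit in regular use
```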

A longer period of time would be the “twelveday,” roughly equivalent to a week. Like a week, each day of the twelveday cycle has a traditional name. If this program generated moons, and if this planet had any, I’m sure they would throw a whole different monkey wrench into the system, but as it is, the number of local days in a year doesn’t really fit with the base-12 motif. Nature does that. No respect for the holy perfection of mathematical systems.

Most Gwap cultures divide up the local year in one of two ways. Some divide it into 22 named twelvedays and a special holiday season that is 9 days long except every fourth year, when it is 10 days long. Others divide it into 12 “months,” nine of them 23 days long and three of them 22 days long, with a special carnival day every fourth year.
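Both calendars tile the 273.24-local-day year the same way; checking the arithmetic:

```python
# Scheme 1: 22 named twelvedays plus a 9-day holiday season
# (10 days every fourth year)
scheme1_common = 22 * 12 + 9        # 273-day common year
scheme1_leap = 22 * 12 + 10         # 274-day leap year

# Scheme 2: 12 "months" -- nine of 23 days, three of 22 days --
# plus a carnival day every fourth year
scheme2_common = 9 * 23 + 3 * 22    # also 273 days
scheme2_leap = scheme2_common + 1   # also 274 days

# Average over the four-year cycle: 273.25 days,
# close to the astronomical 273.24 local days
average = (3 * 273 + 274) / 4
```

The tiny residual (0.01 day per year) would drift about a day per century, which a Renaissance-level culture could plausibly ignore or patch with occasional decree.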

Spatial measurement is a bit more complicated. For large distances the standard is the distance the most common local beast of burden (called the wog) can travel in a local day. They’re a bit slower-moving than an Earth horse, but they can maintain a pace of about 8 km/hr throughout the daylight period without needing rest, so a “day’s travel” comes to about 95 km (say 92-98 km in practice; local mapping practices aren’t up to more precision than that, anyway). A 1728th part of a day’s travel would be about 55 m (a mazwa), and a 144th part of that would be about 38 cm (called a minot). By coincidence, a particularly tall Gwap can be around 152 cm in height, making it seem a good standard of comparison. In practice, the average Gwap is around 144 cm in height, so the measure tends to come up a bit short, about 36 cm. A typical Gwap can jump about 190 cm in a single hop, which leads to a parallel unit of length, the yawm (about 1.3 cm), used pretty much only in athletic competitions.
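The length units all chain down from the wog's pace; a sketch, taking "the daylight period" as the six bells of equinoctial daylight:

```python
local_day_s = 23.79 * 3600
daylight_hours = (local_day_s / 2) / 3600  # six bells ~ 11.9 hours

days_travel_km = 8 * daylight_hours        # ~95 km at the wog's 8 km/hr
mazwa_m = days_travel_km * 1000 / 1728     # ~55 m
minot_cm = mazwa_m / 144 * 100             # ~38 cm
yawm_cm = 190 / 144                        # ~1.3 cm, from a ~190 cm hop
```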

Mass or weight measurement is surprisingly rationalized to the measures of time and length. It is based on the weight of water contained in a cylindrical barrel one minot (36 cm) high by one minot in diameter (in practice, about 35-40 kg, or 288-330 N, depending on the locally preferred minot).
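The barrel-of-water standard checks out: with the 36 cm practical minot, the cylinder holds about 37 kg of water, squarely in the quoted 35-40 kg range, and Holman IV's 8.224 m/s² gravity gives the newton figures:

```python
import math

minot_m = 0.36   # one minot, using the ~36 cm practical value
g_local = 8.224  # Holman IV surface gravity, m/s^2

# Water filling a cylinder one minot high by one minot in diameter
volume_m3 = math.pi * (minot_m / 2) ** 2 * minot_m
mass_kg = volume_m3 * 1000  # water density ~1000 kg/m^3
weight_n = mass_kg * g_local
```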

Areas of land are typically measured in square-mazwa parcels (hagama, 3,025 m², about 3/4 acre), while bolts of cloth are measured in square minots (ela minot, 1,296 cm²). Beyond that, there are few other common measurements.

For something that started out as a bit of idle thought while working on other things, and was only fleshed out with the most basic of world information, this was rather a lot of detail. Now on to learning more about the Gwaps themselves and their world.

Posted in Aliens, Science Fiction, World Building | Leave a comment

Using GIS Tools and Data

There is now a second part here.

Recently (…-ish) I found something on the Zompist forum that I found interesting. Gareth3 was using existing real-world data (in this case, Stewart Island, off the southern tip of New Zealand) scaled up to represent an entire continent.

There are a number of problems with that rescale. First off, with a simple rescaling of the existing elevations, tall, steep mountains become wide, gentle slopes. To some degree, this can be handled by also scaling up the elevation range to something a bit more continent-worthy, and by raising the existing elevations to an exponent greater than one, which will tend to make the high points sharper and pointier and flatten out the lower areas between. Second, there’s the problem of climates. There is some discussion in the thread about the perils of using existing maps. The entire island is fairly small and looks to be covered pretty much entirely with what I figure would be a maritime west coast climate. An actual continent would have a variety of climates, including, in Gareth3’s modeling, significant desert areas, covered on existing maps with many streams. Our own stream mapping (down the line) would be based on the modified elevations and climate mapping (precipitation minus evaporation for each grid point, propagated downhill).
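The rescale-plus-exponent idea can be sketched with NumPy (the sample heights and the 4,000 m ceiling here are illustrative numbers of mine, not anyone's actual values): normalize the heights, raise to a power greater than one to sharpen peaks and flatten lowlands, then stretch to a continent-worthy range.

```python
import numpy as np

island_m = np.array([0., 100., 400., 800., 979.])  # sample island elevations
new_max_m = 4000.0                                 # continent-scale ceiling
exponent = 2.0                                     # >1 sharpens the peaks

normalized = island_m / island_m.max()             # scale into [0, 1]
continent_m = (normalized ** exponent) * new_max_m

# Compare against a plain linear rescale: low and middle elevations
# come out much lower, so the terrain reads as sharp peaks over plains.
linear_m = normalized * new_max_m
```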

Here is a little something I cooked up to import and modify the elevation map. I’ll describe my methodology, as it could prove useful to others. Affairs of climate and stream mapping will be dealt with later as I’m currently putting my GIS tools back in order after a major crash and system upgrade.

First, I went to the EarthExplorer site. I used to use, but that seems to have gone away 😡. To download data, you need to be logged in, but setting up a profile is free. So, why not?

Under the Search Criteria tab, I selected the Address/Place tab, entered “stewart island” and tapped Show. Click on “Stewart Island / Rakiura, Southland 9818, New Zealand”. Zoom in a lot and tap Clear Coordinates. Click on the map to define the outline of the area you want to use. Now select the Data Sets tab. In the hierarchical list click on Digital Elevation, then SRTM. Under SRTM, check SRTM Void Filled and SRTM Water Body Data. Then click on the Results tab. You should get four elevation data sets and four water body data sets. Click on the footprints for each of these to ensure that they cover the desired area. If they’re where they should be, click on the download button for each data set in the SRTM Water Body Data. Then click on each of the SRTM Void Filled elevation data sets; in the resulting window, click the Download button next to TIFF, although it looks like Wilbur can handle DTED.

Now that you have the data, this next step will require QGIS. You can get QGIS free here.
Next, add the four raster layers to QGIS. From the main menu, select Raster>Miscellaneous>Merge…
Select the four input files, then browse to a location to place the resulting merged elevation file. I named it, and selected VTP .bt as the type. Now, I like to double-click on the new layer in the legend area to open its properties. In the Style tab, under Load min/max values, I select Min/Max and Actual (slower), then hit Load and OK. This just makes the display look nicer.

Next, I’m going to clip the coverage area of the final raster to just the island. Start by clicking the New Shapefile Layer tool. This will create an empty new vector graphics object. Give it a name and set its type to polygon. We will fill that with the boundaries of our desired area. With your new shapefile selected, tap Toggle Editing in the Digitizing toolbar. This will allow you to define the area of the island’s raster. Click the Add Feature tool in the Digitizing toolbar. Now click in the water areas around the island to create a new polygon encircling the island. As a guide, you can create a “landarea” raster using Raster Calculator under the raster menu. Simply set the expression to “merged_elevations > 0.0”. Place that below the shapefile layer as a guide. Tap Toggle Editing again to finish your polygon.

Now pull down Raster>Extraction>Clipper… to actually clip the elevations. Set the input file to your merged_elevations, set the Clipping Mode to Mask Layer with the shapefile you just created as the Mask Layer. Name the resulting file to something like “clipped_elevations”.

Open Wilbur and open up
Use the paintbrush with the value set to something tiny, like 0.001, and the Operation set to Maximum to paint over areas below zero within the island. These below-zero spots may be a fault in Wilbur; I’m not sure. I don’t see them in QGIS.

Next Surface>Locate>Flip Vertically to get roughly the same arrangement as has been used previously in this thread.

We’ll need to use an exponent operator to contract the mountain areas. Realistically, most continents would have a bit less highland, so that’s what I’m going to do here. Select>From Terrain>Height Range… from Minimum: -1 to Maximum: 0. With that selection, Filter>Fill>Set Value… Set Value: -1; otherwise the exponential operation tends to mess things up. Deselect. Next, Filter>Mathematical>Exponent… Set the exponent to 2 for the Land (above sea level), 0.5 for the Sea (below sea level), and 0 for the Sea Level. Preserve Height should be set with Absolute Low set to -1 and Absolute High set to 791.

Use rectangle select to select an area very close all around the island. Now, Surface>Crop to Selection.

Finally, to place this where it belongs, sort of, Surface>Find Min/Max… Top: 60, Left: 80, Right: 140, Bottom: 0. File>Save As… set the type to Binary Terrain Surface(*.bt). Name it something like “squared_elevations”.

Back to QGIS. We used Wilbur to assign the new coordinates, which were stored in the VTP .BT format, but the projection is hardwired as UTM. So, when the elevations file is loaded, we need to reassign a WGS 84 latitude-longitude “projection” (Raster>Projections>Assign Projection…, with the file as the Input file and EPSG:4326 as the Desired SRS; this will be an equirectangular “latlon” coordinate system). Looking back on it, I think I can manage the projection described in this thread using georeferencing and probably a polar equidistant projection.

Finally, I’m going to use the Raster Calculator one more time, with the expression 4*squared_elevations + 2.375*unsquared_elevations to give me a better range. That will give us our final elevations.

For now, I’m going to leave it at that. There’s a lot more you can do, particularly when you’ve created a vector representation of the landmasses. The image below has streams everywhere and labels helping to identify some broad regions. None of this is final, particularly the streams (they were made using the D8 River Finder in Wilbur, so I’m not too excited by the results; also, Wilbur has no really good way to edit down rivers in drier places), but it’s good as a demonstration.

Posted in Mapping, World Building | 2 Comments

PlanetCell on Blender Cycles – Improved Speed

Previously, I began work on PlanetCell, a noise-based world map generator. Work on PlanetCell kind of came to a screeching halt, in large part because rendering proved to be so slow. For my purposes, it seemed entirely possible to make a decent map render with Blender’s internal renderer (in fact, it has a better selection of procedural textures installed), but I can’t figure out how to composite multiple textures in internal. I recently saw a YouTube video by Andrew Price on speeding up Cycles renders. With that information under my belt, I decided to try some initial speed optimizations for PlanetCell. I’d consider this version 0.1.1, as I haven’t changed any of the actual texture generation procedures. I figure after I’ve got a fully developed world map generator I should also optimize the OSL code. But for now, I’m just going to go over Mr. Price’s optimizations.

Some optimizations I found to be impossible for my old, sad computer. My GPU, sadly, is not recognized by Blender. So I’m stuck in CPU rendering. This is a big optimization, and I’m hoping my next computer will have a Blender-recognized CUDA device.

I know that I only have one light source, and I have no desire to deal with reflections, transmissions and such, so I set all bounces to 0. This makes rendering a bit less processor-intensive. I also set the integrator preset to Direct Light. I unchecked shadows and all caustics and set filter glossy to 1.0 to reduce noise. I reduced samples from 10 to 4. I raised the internal light strength all the way to 800 and cranked the film exposure down to 0.03 to put more light energy on the subject but keep the effective brightness down about where it had been. Also on the film tab, I increased the filter width to 3 pixels, again to damp down noise. Finally, following the advice of Andrew Price, I reduced the tile size to 32×32. I tried 16×16, but 32×32 seemed faster.

I did make a few minor alterations to the texture of the planet surface. Inside the Sea Coloring group, I reduced the distorting noise to 1 octave of detail. I also reduced the Detail of the Sea Coloring itself to 2 octaves. To make up for these changes, I doubled the Distortion strength to 0.2. Mostly for aesthetic effect, because I’m still unhappy with the look of the sea areas, I decreased the scale of the Sea Coloring group to 9,6,9 and changed the scale of the internal distortion noise to 3,3,3. Reducing octaves in noise that isn’t really all that satisfactory anyway should speed up rendering. I don’t expect I’ll be adding much additional complexity to the Sea Coloring group. If anything, I may choose to ditch it altogether in favor of a solid color. For now, though, this will have to do.

While the Height generator is still unsatisfactory, I’ll save further work on that for another time. In fact, I might prefer generating elevation data externally, loading it as an image texture, and using PlanetCell as merely a beautifying step. I will probably keep the internal generator as an option depending on how well I get it working.

I guess I’ll refer to this as 0.1.2, because I had some work on optimizations which I’ve lost along the way and I did, ultimately, make minor changes to the texture.

While the earlier version took just about all day(6-8 hours, as I recall; I ain’t doing it again), this one took only 43 minutes and 35 seconds. A major improvement! The actual map image produced is shown at the top of the page.

Although my initial thought was that this was too minor an update to bother posting, after seeing how much of a speedup I got, I decided to put it up for those interested. It’s right here on Google Drive.

Please feel free to use the comment system to leave any comments or questions. Thank you,
The Astrographer

Posted in Mapping, World Building | Leave a comment

A Few New Things About Wilbur

This post is just a quick aside about a few things I noticed about Wilbur while working on my last post.

The conventional rendering style on Wilbur.

• Typically, when I want to show hillshade on land, but not on the water, I go through a somewhat involved procedure. First, I save a seamask for the surface I’m working on. Then I save a copy of the fully unshaded surface image(Texture>Shader Setup…, General tab, and in the Display Type pulldown select “Height Code”) and load that into Photoshop. Next, I create a fully shaded surface image by selecting “Lighted”. I load the shaded image as a layer above the unshaded layer and use the seamask as a layer mask for it. Not horribly complicated, but I’ve learned an even quicker, simpler method.

Open the Shader Setup… window under the Altitude tab. In both the Land and Sea areas, there is a text field labeled Opacity. By default, this is 0, but if you set it to 1, the shading for that part of the map will be altogether hidden. You can also use values between 0 and 1 to tone down the hillshading without eliminating it altogether.

The Photoshop methodology may still be somewhat more flexible, especially if you save a copy of the Lighted map with a straight white background and composite it separately in Photoshop. Still, this provides a quick method with good results. It also brings the Wilbur feature set just marginally more in line with Fractal Terrains.

Altitude shading is completely obscured over water when the opacity in the Sea section is set to 1. No Photoshoppery involved.

• For temperature, I used a very loose approximation of insolation. I should have taken the fourth root of that, i.e. sqrt(sqrt(cos(asin(y)))) rather than cos(asin(y)). Unfortunately, this causes a very narrow band of maximum values on the southernmost edge of the map, where there should be minimal values(-1 on a -1 to 1 scaling). This is frustrating. Am I doing something wrong in my setup, or is it a bug in the program? I dinnae ken.
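The point of the fourth root, by the way, is that equilibrium temperature scales with the fourth root of insolation(Stefan-Boltzmann), so the corrected curve falls off much more gently than the raw cosine. A minimal Python sketch of the two curves(the function names are mine, purely for illustration):

```python
import math

def insolation(y):
    """Relative annual-mean insolation from the normalized
    y-coordinate on a unit sphere(y = sin(latitude)).
    cos(asin(y)) is just sqrt(1 - y*y)."""
    return math.cos(math.asin(y))

def temperature_proxy(y):
    """Equilibrium temperature scales with the fourth root of
    insolation, so the latitude falloff is much gentler."""
    return insolation(y) ** 0.25

# At 60 degrees latitude the raw cosine has already halved,
# but the fourth-root curve has only fallen to about 0.84.
y60 = math.sin(math.radians(60))
print(round(insolation(y60), 3))         # 0.5
print(round(temperature_proxy(y60), 3))  # 0.841
```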

• As a second, and ultimately better and more flexible, method for generating temperature(and hopefully precipitation and other) models, I tried using Mathematica and Matlab/Octave. Unfortunately, most of the high-bit export formats available in these are not importable by Wilbur. Even .mat, which Wilbur supports, is not compatible with the versions of .mat exported by my copies of either of those apps. Unfortunate. I might have to convert with GDAL. We’ll see if I can make this work…

• While fBm, Ridged Multifractal, and to a lesser degree Heteroterrain all work perfectly well with the default settings of H=1.0, Lacunarity=1.9, Offset=1.0 and fgain=2.0, the other fractal types in the Calculate Height Field filter are less successful. After a lot of experimentation, I found that H=0.5, Lacunarity=2.1, Offset=2.1 and fgain=1.5 worked quite well for Multifractal. This was a particularly finicky fractal, by the way; I had a hard time finding settings that worked well.

This is the Multifractal type with H=0.5, Lacunarity=2.1, Offset=2.1 and fgain=1.5. The texture also has a reduced shading on the sea areas(Opacity=0.9).

Next, for the Hybrid Multifractal, I used parameters given by Doc Mojo himself: H=0.25, Offset=0.7.

This is the Hybrid Multifractal type as generated with Musgrave’s preferred settings. Not bad, but not great.

Because of the small value of H, there seems to be little spectral variability. It makes for a somewhat different effect than I’m used to, but it adds another tool to the kit.

It should be noted that not all of the height field generator types use all of the parameters. fBm uses only:

• H, lacunarity, octaves and random seed;

• sphere center or position, which can be used to alter the specific appearance without changing the random seed;

• sphere radii or size, which determines the base spatial frequency of the fractal;

• scaling, which determines the range of heights produced by the fractal function;

• spherical evaluation, which determines whether the fractal is generated on the 3d surface of a sphere or on a 2d planar surface;

• spherical area, which determines what slice of the spherical surface is covered by the 3d generator;

• sphere axis, which… seems to do… nothing…

These parameters are all used by all of the generator types except recursive subdivision(plasma) and math function(neither of which is covered here).

Heteroterrain is described on page 500 of Texturing and Modeling: A Procedural Approach, 3rd edition(I can’t seem to find an available version on the internet…) under the heading Statistics by Altitude. To quote Musgrave, ”The observation that motivated my first multifractal model is that, in real terrains, low-lying areas sometimes tend to fill up with silt and become topographically smoother, while erosive processes may tend to keep higher areas more jagged. … We accomplish our end by multiplying each successive octave by the current value of the function. Thus in areas near zero elevation, or “sea level,” higher frequencies will be heavily damped, and the terrain will remain smooth. Higher elevations will not be so damped and will grow jagged as the iteration progresses. Note that we may need to clamp the highest value of the weighting variable to 1.0, to prevent the sum from diverging as we add in more values.” This type adds an Offset parameter.
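As a rough illustration of the multiplicative damping Musgrave describes, here is a small Python sketch of a heteroterrain-style function. Be warned: the noise() stand-in is just a hash, not real band-limited Perlin noise, and the parameter names only echo Wilbur’s; this is not Wilbur’s actual implementation:

```python
import math

def noise(x, y):
    """Deterministic hash-based stand-in for a real smooth noise
    function(Perlin, simplex, etc.), returning values in [-1, 1)."""
    n = math.sin(x * 12.9898 + y * 78.233) * 43758.5453
    return 2.0 * (n - math.floor(n)) - 1.0

def hetero_terrain(x, y, H=1.0, lacunarity=2.0, octaves=6, offset=0.7):
    """Heterogeneous terrain in the spirit of Musgrave: each new
    octave is weighted by the running value of the function, so
    near-zero('sea level') areas stay smooth while higher areas
    accumulate detail and grow jagged."""
    spectral = lacunarity ** -H        # per-octave amplitude falloff
    value = noise(x, y) + offset       # first octave sets the base
    weight = spectral
    for _ in range(1, octaves):
        x *= lacunarity
        y *= lacunarity
        # damp this octave by the current height: smooth valleys
        value += (noise(x, y) + offset) * weight * value
        weight *= spectral
    return value
```

Swapping the damping term in and out makes it easy to see why low areas come out smoother than peaks.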

Hybrid multifractal is described in the same book(also starting on page 500) as, ”My next observation was that valleys should have smooth bottoms at all altitudes, not just at sea level. It occurred to me that this could be accomplished by scaling higher frequencies in the summation by the local value of the previous frequency… Note the offset applied to the noise function to move its range from [−1, 1] to something closer to [0, 2]. … You should experiment with the values of these parameters and observe their effects.” Again, this type has an Offset parameter. It displays heterogeneous terrains(surprise): flat plains, foothills and jagged mountains.

Ridged multifractal is described as similar to the previous hybrid multifractal. It uses the offset parameter, but also an additional gain parameter(fgain in Wilbur). The parameters given by Musgrave are the same as the default parameters(H:1, offset:1, gain:2), not surprising as Ridged Multifractal is the default type for the Height Field Calculator.

Multifractal is described as a “multiplicative multifractal”. At the time Texturing and Modeling, 3rd edition was published, it was not well understood. It uses all of the fBm parameters, of course, and also offset. Although I can’t find gain in Musgrave’s description or the code he presents, fgain does appear to have an effect in the Wilbur version of Multifractal. I can’t pretend to know why…

All of the fractal functions can produce good results, but many are very sensitive to exact settings, and for some the default settings are simply awful. They’re all definitely worth a bit of playing with.

I have in the past suggested that when trying to paint fractal effects into specific locations(generally using selections), it’s best to use homogeneous fractals(notably fBm). I’m not entirely convinced of this anymore. If the base frequency and scaling are chosen carefully, heterogeneous fractals can add quite a bit of interest and “pizzazz” to your mountain regions and other terrains.

Hopefully, these notes will prove useful.

Thank you,
The Astrographer

Posted in Mapping, World Building | Leave a comment

Creating Attractive Satellite-style Textures

This is the final, fully-textured hillshaded map image.

This post is going to be a short overview of a method for creating a plausibly-realistic and visually attractive satellite-view of a habitable planet.

First, I run tectonics.js until I find an attractive arrangement of continental features. Or… just, you know, continents. In Photoshop or GIMP, I create selections of high elevations and areas that appear to be highlands in the bedrock shader to serve as masks in later manipulations.

In Wilbur, I begin by loading a selection of the land areas and slightly feathering it. With this selection, I add some low relief with Filter>Calculate Height Field… set to Hetero Terrain type and Replace operation. Then I load and feather each of the other masks one by one and add additional relief using the Add operation of Calculate Height Field. Each time, I change the random seed and increase the sphere radii. Somewhere between the lowest highlands and the higher mountains, I change the type of the fractal from Hetero Terrain to Ridged Multifractal, so that the higher elevations will be more craggy.

Finally, I will load an image of the plates shader from tectonics.js to serve as a guide to placing additional mountains. Unless you paid close attention to the movements of the features as tectonics.js was running, you’ll have to simply choose some of the plate boundaries as areas to place additional mountains and hills. These can be added by creating feathered selections and applying calculated fractals as above.

I painted in additional details with the raise and lower tools, and then applied an iterative process of basin fill, erosion and noise to complete the elevation map. Save this set of elevations as a BT and a PNG Surface. You’ll also want a flattened-seas version, so use Filter>Height Clip… with Min: sea level(probably zero) and Max: higher than the highest mountain on the map(which you can find using Window>Histogram…). Also save this flattened-seas elevation set as a BT and a PNG.

Now comes the real meat of this post: using the best features of Wilbur and Photoshop/GIMP to create a realistic-looking planetary surface texture.

The latitude shader in Wilbur still doesn’t work terribly well as of version 1.86, so I won’t be using that. Besides, I have come up with a method to use the much more powerful compositing tools of any good image processing app.

To do the ice maps, start with the elevation map in meters with seas flattened and use Filter>Mathematical>Scale(Multiply)… with an appropriate lapse rate(for Earth, with elevations in meters, -0.0065ºC/m works). For Earth, this should result in a variation in temperature due to altitude of about 0 to -56K. Next, add in a sea-level temperature variation. In Wilbur, select Filter>Calculate Height Field… with type: Math Function, operation: Add, expression: cos(asin(y)), sphere center: xyz(-1,-1,-1), sphere radii: xyz(2,2,2), spherical evaluation: checked(as always), and surface scaling to a range of about 30 to -25 to allow room for some random variation. Extremes of mean annual temperature on Earth work out, after some very cursory research, to 35ºC to -26ºC as adjusted to sea level. Similarly cursory research showed that while mean annual temperatures are affected by altitude, and while the variations around that mean get much wider with distance inland and latitude, the means largely aren’t sensitive to distance from the sea. So, as I’m just working out mean annual temperatures, I don’t need to take raw distances from the ocean into account. To add a little random variation, I ran a Hetero Terrain fractal with a sphere radius of 8, only 4 octaves, and a scaling of 5 to -2.
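The whole pipeline in that paragraph boils down to summing three terms. Here’s a sketch in Python with my own function names and the Earth-ish constants from above; the noise term is left as a plain argument rather than a fractal:

```python
import math

LAPSE_RATE = -0.0065  # degC per meter of elevation, Earth-like

def sea_level_temp(y, t_min=-25.0, t_max=30.0):
    """Rescale the cos(asin(y)) latitude curve(0..1) into a
    sea-level mean annual temperature range in degrees C."""
    return t_min + (t_max - t_min) * math.cos(math.asin(y))

def mean_annual_temp(y, elevation_m, noise_c=0.0):
    """Sea-level latitude term + lapse-rate altitude term + an
    optional random-variation term(the fractal noise in Wilbur)."""
    return sea_level_temp(y) + LAPSE_RATE * elevation_m + noise_c

# Equator at sea level, then a 3000 m peak at 45 degrees latitude.
print(round(mean_annual_temp(0.0, 0.0), 1))                            # 30.0
print(round(mean_annual_temp(math.sin(math.radians(45)), 3000.0), 1))  # -5.6
```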

A map of temperatures on this planet. I seem to be using too much noise, and the lack of definition of continental areas makes me think that I do not have enough midrange elevations. Red is hottest, grading down through yellows and greens to black. The sharp line between black and light blue is at 0ºC. Blues darken through purples toward black at the coldest temperatures.

I used an unshaded version of this map with a different color scheme to show a spatial distribution of mean annual temperature. I thought the image was effective, but I’m not happy with the amount of noise and it showed that the hypsometric distribution was not altogether satisfactory. Still working on that! I also think I might have done better with a ridged multifractal here.

With this temperature map, use Select>From Terrain>Height Range… to create ice masks. For a persistent snow and land-ice mask, set the maximum to 0.0ºC. Salt water freezes at a lower temperature, so for the sea-ice mask, set the maximum to -13ºC. When applying ice and snow layers in your image editor, place the land ice layer above most everything else, but below any relief shading layer. I placed the sea ice layer above the hillshade to lighten it a bit. When I did this, I simply used the black and white masks directly, selecting the black areas with the magic wand and deleting them. It might be better, and certainly more flexible, to use these masks as layer masks on some kind of nicely-textured ice color layers. I haven’t tried that yet, but it seems like a worthwhile experiment.
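The mask logic itself is just thresholding. A toy version with the grids flattened to plain lists(the -13ºC sea-ice cutoff is the one used above, not a physical constant):

```python
def ice_masks(temps_c, land_mask):
    """Threshold a mean-annual-temperature grid into ice masks:
    persistent snow/land ice at or below 0 C over land, sea ice
    at or below -13 C over water."""
    land_ice = [t <= 0.0 and land for t, land in zip(temps_c, land_mask)]
    sea_ice = [t <= -13.0 and not land for t, land in zip(temps_c, land_mask)]
    return land_ice, sea_ice

# Four cells: warm land, cold land, frozen land, frozen sea.
land_ice, sea_ice = ice_masks([10.0, -5.0, -20.0, -20.0],
                              [True, True, True, False])
print(land_ice)  # [False, True, True, False]
print(sea_ice)   # [False, False, False, True]
```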

I got a lot of information from examining the shader code in tectonics.js, which I found useful in shading my own model. For bedrock coloring, the modeller uses a gradient from mafic rock(RGB(50,45,50)) to felsic rock(RGB(190,180,185)), based on height. For this, I used the original flattened-seas elevation map with a two-color altitude shader in Texture>Shader Setup… The colors would be the mafic color near sea level with felsic at higher altitudes. In retrospect, I could have added noise to the elevation map for variety, but I didn’t this time. I also played with the colors a bit to make them more aesthetic to my eye.

The sediment color map is a latitude map generated as described above, using the Math Function method of the Calculate Height Field filter. I again used a two-color gradient, with a peat color of RGB(100,85,60) for the cold polar areas and a sand color of RGB(245,215,145) as we get into warmer areas. This was based solely on latitude; I could have used the temperature map with a different noise seed, but instead I ignored altitude.

What tectonics.js refers to as mineral fraction, basically describes the amount of sediment covering the underlying bedrock. I used a straight flat seas elevation map with a two color gradient, ranging from white at sea level to black at the highest points. I used this in my image editor as a layer mask for the sediment color layer placed above the existing bedrock color layer generated above. Again, I could have used some additional noise to add variety to this, but I decided to keep things kind of simple.

Next, I created a vegetation color layer. Although tectonics.js uses a single jungle color of RGB(30,50,10) everywhere, simply varying the alpha strength over the underlying sediment and rock layers, I used a latitude-based temperature model as described above, with a two color gradient of a darker, more saturated variation on the jungle color at low latitudes and a lighter, greyer variation for high latitudes. I could have, and probably should have used the full latitude-altitude model, but I got lazy.

Rather than using a model taking precipitation and temperature into account to build the vegetation mask, I decided to hand-paint it, based on the satellite-view map derived from tectonics.js. I’m thinking that, once again, the modelling process would result in a more interesting and possibly more plausible texture, but it’s a bit more complicated to derive a plausible precipitation map. The simulator uses k1 + k2cos(6lat) + k3(1-lat/90) – k4dist_to_cont_edge*sin(lat), which is complicated, and the user has to work out values for k1, k2, k3 and k4(for the sake of simplicity, k4 can probably be set to 0.0). The prospect of combining the temperature and precipitation maps(again, figuring out weights) was daunting. I also wanted a small positive altitude element to the precipitation, with varying strength depending on latitude. For now, I’ll just paint this in. Later, I might come up with something less uninteresting…
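For what it’s worth, the quoted formula is simple to evaluate once you pick the weights. A Python sketch with purely made-up k values(latitude in degrees, dist_to_cont_edge left in arbitrary units):

```python
import math

def precipitation(lat_deg, dist_to_cont_edge=0.0,
                  k1=1.0, k2=0.5, k3=0.5, k4=0.0):
    """The tectonics.js-style proxy quoted above:
    k1 + k2*cos(6*lat) + k3*(1 - lat/90) - k4*dist*sin(lat).
    The cos(6*lat) term puts wet bands near 0 and 60 degrees
    and dry bands near 30 and 90. The k weights are guesses."""
    lat = abs(lat_deg)
    return (k1
            + k2 * math.cos(math.radians(6.0 * lat))
            + k3 * (1.0 - lat / 90.0)
            - k4 * dist_to_cont_edge * math.sin(math.radians(lat)))

# With these toy weights, the 30-degree band comes out drier
# than either the equator or the 60-degree band:
print(precipitation(0.0) > precipitation(30.0))   # True
print(precipitation(60.0) > precipitation(30.0))  # True
```

Plotting this over latitude makes picking the weights much less mysterious than guessing in Wilbur.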

I started by creating a layer mask for my vegetation color layer. I filled the mask with black. Next I loaded a seamask selection. Make sure Sea White is not checked when generating the mask using Texture>Gray Maps>Sea Mask…, or if you’re using an already-generated white sea mask, simply invert the selection. We want our strokes constrained to the land area. Next fill in the land area with white.

Now, paint the broad strokes into the layer mask using a large, soft brush in multiply mode and a very light gray color. This will reduce the vegetation in the polar areas; then roughly follow the latitudes where most of the desert areas lie. If you were working in Wilbur, this would be around 30º N and S, but your image editing app probably won’t show such nice geographic coordinates.

I also painted out the mask in areas under ice, though with much less success. Ice-covered areas should really have no vegetation at all. High mountains should also have reduced vegetation, although in desert areas they might get a little more due to orographic precipitation.

Next, using progressively smaller and darker multiply brushes, paint out the desert and less vegetated areas in more detail. Finally, using a river flow map from Wilbur as a guide, stroke along the rivers with a smallish, very dark gray screen brush. This will show floodplain areas that are considerably more lush in otherwise less vegetated regions, but will have little perceptible effect in heavily forested or jungle areas. Rather than hand-painting the areas around rivers, one could directly apply a green-shaded layer controlled by a blurred copy of the river flow map.
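That last alternative is easy to prototype. In one dimension, with values in 0..1: blur the flow map, then screen it over the vegetation mask. Note how the sparse cell next to a river gains far more than the already-lush one(all numbers invented):

```python
def box_blur_1d(row, radius=1):
    """Cheap box blur to soften a river-flow map(a stand-in for
    a proper Gaussian blur of the flow image)."""
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

def screen(base, overlay):
    """Photoshop-style 'screen' blend: 1-(1-a)(1-b). It lightens
    sparse areas strongly but barely moves values already near 1."""
    return [1.0 - (1.0 - b) * (1.0 - o) for b, o in zip(base, overlay)]

veg_mask = [0.1, 0.1, 0.1, 0.9, 0.9]            # desert cells, then forest
flow = box_blur_1d([0.0, 1.0, 0.0, 1.0, 0.0])   # two rivers
boosted = screen(veg_mask, flow)

# The desert cell by a river jumps from 0.1 to 0.4; the forest
# cell by a river only creeps up from 0.9 to about 0.93.
print(round(boosted[1], 2), round(boosted[3], 2))
```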

I created fully shaded flat maps with rivers shown clearly, as in the image at the top of the page. For use in texturing a globe for 3d renderings, hide the river and hillshade layers and save the resulting image. This will be shaded in the rendering using a bump map or normal map derived from the elevation map. Typically, I would save the elevations from Wilbur as a BT and then use GDAL to convert it to a nice 32-bit tiff, but for those without gis tools or the willingness to learn them(GDAL is free, by the way), you can simply save a 16-bit PNG Surface directly from Wilbur(unfortunately, at least in Cycles, 16-bit PNGs don’t seem to work well, so use your favorite graphics editor to convert to tiff).


Good modelers can make excellent photorealistic imagery with the Blender Internal Renderer, but I can’t do much better than cartoonish. The render proved disappointing, but darn quick.


The Cycles rendering took a lot longer, but my skills with Cycles are still growing. In any case, I really should have upsampled the underlying texture images much earlier in the process.

The process is still very much a work in progress, but it’s leading in a direction I find very promising. A few notes:

• Upsample the textures. A lot.

•• Probably start with low-res, heavily blurred masks. Add noise and apply erosion, then resample. Do this in multiple steps, upsampling both the masks and the heightfield, then adding noise and erosion. Erosion may be too slow at high resolutions, but hopefully 8192×4096 or perhaps 16384×8192 will be sufficient to avoid artifacting in the final render.

• Figure out how to create precipitation models analytically, rather than by hand.

• Create vegetation masks from temperature, precipitation, river flow and distance imagery.

Hopefully, the next try will be good enough for a final model…

Thank you for reading this,
The Astrographer

Posted in Mapping, Planetary Stuff, Projects, World Building | Leave a comment

Repost: Toponymy

Some of the best mappers I know will quail in terror when the time comes to add names to their maps. Sometimes it’s just the problem of clearly labeling the various features of their newly created landscape without making it look bad. That’s a whole article of its own, and one I may be uniquely unsuited to writing. That’s a pretty common problem, but the even more common problem is the one of creating believable, consistent names for all of those features. In spite of the difficulty, and it is considerable, of clear and attractive labeling on a map, I’ve seen my share of beautifully rendered lorem ipsum names.

Actually, in a pinch, lorem ipsums aren’t that bad. At least they generally have a consistent sound. It’s difficult to create a large number of consistent sounding names off the top of your head.

This post isn’t intended as a complete or authoritative exploration of the process of creating names for map features. It is simply a quick survey of methods and ideas I have used myself or heard discussed. The intention of this post is to put up some possibly good ideas and hopefully spur some discussion.

The process of creating names often varies depending upon whether the world being created is intended to be alien, fantasy, or something from the very far future or distant past on the one hand or something like a near(-ish) future colony planet on the other hand.

In the former case you can have a very free hand. You can choose pleasant-sounding nonsense for places you intend to give a pleasant appearance(this could perfectly well be a false impression), like “Assaremin”, and unpleasant-sounding names for places you wish to give a bad impression(this, too, can be a red herring), like “Kuntogoloth“. You can even give many names that would be equally suited to a more familiar or realistic milieu, like “Pleasant Valley.” It could be assumed that these are translations of native names and not the actual names. This last variation gives the map-maker or author a way to differentiate between locales familiar to the POV characters and more foreign locations. If written from the point of view of the elves, a particular coastal inlet might be referred to as “Western Bay”, while the same place might be called “Taga Winsiem” if written from the viewpoint of halflings far from home. On the other hand, the piece of land that the halflings refer to familiarly as “The Boroughs” might be written as “Mushar” if visiting elves are the viewpoint characters. In either case, “Taga Winsiem” would be the proper name in elvish for the bay and “Mushar” the proper name in halfling…-ish for the land. The translations give the readers a sense of familiarity shared with the major characters, and the stranger sounds hint at the characters’ lesser familiarity with other places. Today, for the sake of brevity, I’m going to discuss the problem of creating interestingly unique and self-consistent names for foreign languages. I’ll save the question of what those names might or might not mean for a later post.

With strange alien worlds, we have the problem as mentioned previously, of making the names sound both sufficiently consistent within a culture and sufficiently distinct between different cultures. One way of dealing with the first problem is with the use of name- or word-generation software such as WordBuilder, or sites like Fantasy Name Generator, Seventh Sanctum or the Star Wars Random Name Generator. These vary a lot in usefulness. Alfar’s WordBuilder has the greatest flexibility in terms of being able to build distinct and consistent vocabularies, but it does take a bit more work than the others to generate anything at all. Fantasy Name Generator and the Seventh Sanctum generators have a good variety of different patterns that might produce more or less distinctive and consistent vocabularies, and they are generally fairly easy to run, but you have limited control over the results. Actually, the Fantasy Name Generator has a more advanced text interface. It’s more complicated than just running the presets and less configurable than WordBuilder, but it does make a good middle-of-the-road solution to wordgen.

If you can get HyperCard stacks to run on your computer(good luck), then Rob Prior’s MegaLinguist was uniquely marvelous. Keyword: unique. It wasn’t just a word generator: it could do a fairly simple statistical analysis on a user-entered corpus of words and generate similar words based on that analysis. The ability to analyze an existing corpus and generate random words based on that analysis seems like it would be a pretty high priority among conlangers, but this kind of utility is surprisingly hard to find.

One utility which might serve the purpose is Chris Pound’s language confluxer, lc. The direct results of lc are gibberish, but if you pipe the results into prop(for proper names) or fix(for other words), the results can be gratifying. For example:

./lc -100 sumerian.txt | ./fix > out.txt

This will generate a list of 100 “words” based on the contents of sumerian.txt, pipe them into the fix utility and then place the results in the file named out.txt. Lemon-squeezy! You could also use prop to generate proper names, as the following, more generalized example indicates:

./lc -[#] [data file] | ./prop > [output file]

If you’re just interested in the statistics of the data file then you can run lc with the -s flag:

./lc -s [data file] > [statistics file]

The resulting statistics file will be a series of lines, each beginning with a pair of characters, then a colon and a series of characters. For example:

ab: sssslloor

would mean that in the data file the pair of characters, “ab,” occurred nine times and was followed four times by an s, twice by an l, twice by an o, and once by an r. This information could be useful in creating a script for WordBuilder or some other word generator.

The data file is simply a whitespace-separated plaintext list of words. Namelists from the original Civilization could prove useful… At present, lc is not only the best but the only tool I’ve been able to find for the sort of word-generation-by-analysis-of-existing-corpus that MegaLinguist did so well.
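The digram analysis itself is easy to reproduce. Here’s a toy Python version of the analyze-then-generate loop — nothing like as capable as lc or MegaLinguist, and the sample corpus is just vaguely Sumerian-flavored nonsense of my own invention:

```python
import random
from collections import defaultdict

def digram_stats(words):
    """For each pair of letters, record every letter observed to
    follow it(repeats kept, so frequencies are preserved) --
    essentially the table lc -s prints."""
    follow = defaultdict(list)
    for w in words:
        w = "^" + w + "$"              # word-boundary markers
        for i in range(len(w) - 2):
            follow[w[i:i + 2]].append(w[i + 2])
    return follow

def generate(follow, rng, max_len=10):
    """Random-walk the digram table to build a new word in the
    same 'style' as the corpus."""
    word = rng.choice([k for k in follow if k.startswith("^")])
    while len(word) < max_len + 1:
        nxt = follow.get(word[-2:])
        if not nxt:
            break
        c = rng.choice(nxt)
        if c == "$":                   # hit a word-final pair
            break
        word += c
    return word[1:]                    # strip the "^" marker

corpus = ["lugal", "ninur", "gilgam", "enlil", "ningal", "urnam"]
rng = random.Random(42)
print([generate(digram_stats(corpus), rng) for _ in range(5)])
```

Every adjacent letter pair in the output occurred somewhere in the corpus, which is exactly what gives the results a consistent sound.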

Once we have generated a good long collection of words and picked out the ones we like(trust me on this: no matter how good your generator or script is, you won’t want to use all the words generated), it’s time to place them on your map. That is, to determine the appropriate locations for the names. With both my Shtakamashkan and Kazh maps(the latter was for a competition on the Cartographers’ Guild; sadly, even I didn’t vote for my map), I simply used one generator for all of the names and placed them willy-nilly wherever I thought they fit. Another way, which gives more of a sense of variety in the cultures of a world or region, is to use different parameters for different regions or “countries.”

Another tack I’d like to try at some point is to place a set of seed points on the map. For each of these seed points, you would generate a pretty large word list. Then you model the migration of peoples from these seed points, periodically modifying the sounds of the original seed languages using a program like Mark Rosenfelder’s SCA2. With enough time and effort, you could end up with dozens or even hundreds(!?!) of distinct “languages” from a handful of proto-languages, all with clear genetic relationships to each other. Seems like it could be interesting.
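The sound-change step can be mimicked with ordered regex substitutions — a crude stand-in for SCA2’s rule format. The rules below are invented purely for illustration, not real sound laws:

```python
import re

def apply_sound_changes(word, rules):
    """Apply an ordered list of(pattern, replacement) regex rules,
    SCA2-style, to evolve a proto-word into a daughter form."""
    for pattern, repl in rules:
        word = re.sub(pattern, repl, word)
    return word

# Hypothetical rules: intervocalic t > d, loss of final vowels,
# then kw > p. Order matters, as it does in real sound change.
rules = [(r"(?<=[aeiou])t(?=[aeiou])", "d"),
         (r"[aeiou]$", ""),
         (r"kw", "p")]
print(apply_sound_changes("takwita", rules))  # tapid
```

Chained over a migration tree, each branch would get its own rule list, yielding related but distinct daughter vocabularies.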

Now, admittedly, I need to use the word “languages” with a bit of care here; hence the quote marks… What you have is, at best, a collection of words(or letter clusters, at worst) with potentially interesting similarities and differences across regions. This doesn’t preclude building actual constructed languages on top of the locally-modified word lists. Some of the more interesting places might be worth giving that treatment later, but it is a time-consuming process in any case. Getting a completed map up, at least for reference, is made easier by using simple naming languages. Later on, when it turns out your city named Ga Badrash is grammatically incorrect for the language you develop, you can change the name to Gali Badraska. It’s not like you can’t find any old maps where Beijing is referred to as Peking, for instance.

Good luck and good worldbuilding,

The Astrographer

Posted in Mapping, World Building | 1 Comment

Map of the World I Rendered in Blender

We’ll start with a desktop-suitable version of the planet.

So I decided to go back and make a larger version of the planet image I created previously, for use as a desktop. Basically, that involved just resizing and slightly recomposing the scene and rendering it. To add a bit of a half-assed atmospheric effect, I added an Outer Glow to the transparent planet layer. I think this still might have some value in combination with a good set of cloud and atmosphere spheres in Blender.

Next, I decided to create a flat map-style image suitable for gis tools and the like. The Cycles renderer is not terribly fast on my computer, and the panoramic camera only makes the situation worse. So I made some modifications to the settings to try to reduce rendering time considerably, hopefully without compromising the quality of the map image.

I have zipped up all(or most) of the blend files I used for this here.

I set the resolution to 2048×1024, but any 2:1 aspect ratio will work(the X and Y values labelled as Aspect Ratio control the shape of the individual pixels; 1:1 square pixels are perfect). Output is an 8-bit RGB PNG with no compression. Samples were reduced to only 32, with no speckling I could see; they could probably be reduced further. I went with Direct Light for light paths. I’ve limited light bounces to one for diffuse and glossy and zero for transmission, volume and transparency. The viewport BVH is Static for “faster render”, but I don’t think that will affect rendering outside of the viewport. Under Acceleration Structure, Use Spatial Splits is checked for longer build time but faster renders. The build time at the start seems pretty short in any case, so it shouldn’t cause problems, and it should make larger resolutions render a lot quicker.

I also made some changes to the node graph for shading. First, I disconnected the Bump node. It had no visible effect with the light coming from right in the center, where the camera also resided. I also tried to take out the Glossy BSDF shader, but although the specular effects weren’t visible, I found the water areas to be too dark without it.

The comparison with my previous PlanetCell map render is not perfect, since the shader tree for PlanetCell was quite complex and used a lot of OSL script nodes, which slow things down. Still: my previous PlanetCell renders took… HOOOUUURRRSSS. No, literally. They took hours, nearly an all-day render. My render for this took only 31 minutes and 42.88 seconds.

As a further experiment, I plugged the Color Ramp into a Brightness and Contrast node, then connected that and the original ramp into an RGB Mix node, with the Greater Than and Multiply nodes used to drive the Mix Shader node plugged into the face node. Although I’d hoped this would result in significantly faster renders, it took 32 minutes and 53.42 seconds. Basically a wash.

For a third experiment, I also reduced the Glossy light bounces to zero. That… also seemed to slow things down. It took 33 minutes 28.07 seconds to render. I don’t understand that at all, but whatever.

Finally, I simply lightened up the water colors in the Ramp itself and reduced the render light samples to eight. This time the render time was 7 minutes, 35.81 seconds! I also increased the tile size to 128×128. I wonder if there is some overhead in rendering and moving tiles? Let’s test that.

No other changes, I’m just increasing the tile size to 1024×1024. That separates the rendered image into two sections, so that my dual-core CPU can handle them in parallel. Rendering time: 6 minutes and 3.62 seconds. Apparently there is some benefit to the larger tiles, but additional samples are expensive. Still a pretty quick and decent render.
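Just to put numbers on those experiments, here is a quick bit of plain Python comparing the measured timings above (the timings are from this post; the speedup figures follow directly from them):

```python
def to_seconds(minutes, seconds):
    """Convert a (minutes, seconds) render timing to total seconds."""
    return minutes * 60 + seconds

# Timings measured above, all at 2048x1024:
baseline = to_seconds(31, 42.88)     # 32 samples, small tiles
eight_samples = to_seconds(7, 35.81) # 8 samples, 128x128 tiles
big_tiles = to_seconds(6, 3.62)      # 8 samples, 1024x1024 tiles

print(f"8 samples vs 32: {baseline / eight_samples:.1f}x faster")
print(f"1024px tiles vs 128px: {eight_samples / big_tiles:.2f}x faster")
```

So dropping samples bought roughly a 4x speedup, while the bigger tiles added only another ~25% on top; most of the win was in the sample count, with tile overhead a smaller but real cost.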



Now, we try a larger render. 4096×2048 with a tile size of 2048×2048. That took 21 minutes 49.22 seconds.

4096×2048 8 samples

I tried an 8192×4096 render, still with a tile size of 2048×2048, at both 5 and 8 samples. The 5-sample version took 58 minutes 10.82 seconds to render, but it was much too grainy. The 8-sample render took 1 hour, 33 minutes and 27.97 seconds, and still seemed a bit grainy.

Finally, I rendered 8192×4096 at 10 samples. This took 2 hours 1.16 seconds to render. The improvement in graininess was minimal. I think I’ll stick with 8 samples in the future; the time cost was just too high for the negligible improvement.
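That “minimal improvement” actually matches what path-tracing theory predicts: Monte Carlo noise falls off as one over the square root of the sample count, so small sample bumps buy very little. A quick sanity check in plain Python:

```python
import math

def relative_noise(samples):
    """Monte Carlo variance falls as 1/N, so noise (std dev) falls as 1/sqrt(N)."""
    return 1 / math.sqrt(samples)

# Going from 8 to 10 samples:
improvement = 1 - relative_noise(10) / relative_noise(8)
print(f"about {improvement:.0%} less noise")  # roughly 11%
```

An 11% noise reduction for a ~30% time increase is a bad trade, which is why 8 samples looks like the sweet spot here.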

8192×4096 10 samples

Next week I’m going to Disneyland on my daughter’s Make-A-Wish trip, so I won’t be able to maintain the pace of postings I’ve kept lately. After that, I’ll get right back on the horse.
The Astrographer

Posted in Mapping, Planetary Stuff, World Building | Tagged , , , , , , , | Leave a comment