Noise (bandpassed white)
Wiki Duke


 « Posted 2012-08-10 08:42:04 »

Note: you are viewing revision 16 of this wiki entry.
Main/Procedural content/Noise

Overview

The purpose of this page is to give an overview of how noise functions in this family work and the various tradeoffs that can be made in implementation choices.  How to use noise to create and modify content is a huge topic in its own right; links to more detailed material belong in the reference sections below.

Introduction

This family of noise functions is an incredibly useful tool for creating and modifying content.  According to CG industry lore it was informally observed in the 90s that "90% of 3D rendering time is spent in shading, and 90% of that time is spent computing Perlin (gradient) noise".  Regardless of the truth of this observation, this family of noise functions is certainly one of the most important techniques not only in procedurally generated content but in CG as a whole.  Increases in CPU speed and the relatively recent addition of GPU computation allow for runtime evaluation of the cheaper of these methods in realtime graphics.

Attempting to give any detailed descriptions of how to "use" noise functions to create or modify content is well beyond the scope of any short description.  The goal here is to outline some basics of core generation techniques and to provide links to more detailed information in specific areas of interest.

For the local discussion, we'll assume that noise accepts floating point input for a sample coordinate and returns a floating point value (usually on either [0,1] or [-1,1]).  This page will provide some sketches of 2D implementations to (hopefully) aid in understanding.

Noise functions are evaluated in some number of dimensions (typically 1, 2, 3 or 4).  This is simply to say that you provide some input coordinate and noise returns the corresponding fixed value at that position, just like any other multi-dimensional function.  From a signal processing perspective this family can be described as an attempt to approximate band-pass filtering of white noise.  Perhaps a simpler description would be that they are attempts at coherent pseudo-random number generators (PRNGs).

Regular PRNGs attempt to create a fixed sequence of values (from some initial state data, frequently termed the 'seed') that appear to be statistically independent.  White noise can be created from a PRNG as in the following sketch (in 2D):

float eval(float x, float y)
{
  long seed = mix(x,y);         // map the input coordinate to a seed value
  prng.setSeed(seed);           // set the PRNG's seed to the mix
  return prng.nextFloat();      // return the result
}
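The sketch above can be made self-contained.  The `mix` hash below is an assumption (the original leaves it unspecified); any function that scrambles the two coordinates into one seed will do:

```java
import java.util.Random;

// Runnable version of the white-noise sketch. The 'mix' hash is an
// assumption: any integer hash over the two coordinates works.
public class WhiteNoise2D {
    // Map the input coordinate to a single seed value.
    static long mix(float x, float y) {
        long h = Float.floatToIntBits(x) * 0x9E3779B97F4A7C15L;
        h ^= Float.floatToIntBits(y) * 0xC2B2AE3D27D4EB4FL;
        return h ^ (h >>> 31);
    }

    // Deterministic white noise on [0,1): same coordinate, same value.
    public static float eval(float x, float y) {
        Random prng = new Random(mix(x, y));
        return prng.nextFloat();
    }
}
```

Reseeding a `Random` per sample is slow; it is only meant to show that the "noise" is a fixed function of its input.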

Unfortunately raw white noise is of very little use.  If you were to create a 2D texture from white noise, regardless of how you walk through the 'noise' function the result would be virtually identical: like what you'd see on an old broadcast TV tuned to a channel without a signal.  What's really needed are random values that are coherent, which roughly says that sample points far apart behave like PRNG values (they appear independent), while sample points close to one another vary continuously (or smoothly, in less formal speak).

Value noise

Value noise is one of the original attempts at this style of noise generation.  It is very often miscalled Perlin noise.  Evaluation is very cheap, but it is burdened with serious defects and is very poor at band-pass filtering.  Quality can be improved, but even the most basic improvements make it more expensive than gradient noise.  So a general guideline for this technique is to use only a very cheap version, and only when some existing content can be minorly modified by one or two evaluations.

Value noise is computed by forming a regular grid, computing random values at each vertex and blending the values to produce a result.  Sketch in 2D:

float eval(float x, float y)
{
  // lower left hand corner of cell containing (x,y)
  int ix   = (int)Math.floor(x);
  int iy   = (int)Math.floor(y);

  // offset into 'cell' of (x,y). dx & dy are on [0,1)
  float dx = x - ix;
  float dy = y - iy;

  // generate a random value for each vertex of the cell
  // based on its integer coordinate.
  float r00 = mix(ix,   iy);
  float r10 = mix(ix+1, iy);
  float r01 = mix(ix,   iy+1);
  float r11 = mix(ix+1, iy+1);

  // use some interpolation technique to get the sample value.
  return blend(r00,r10,r01,r11,dx,dy);
}
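A runnable version of this sketch, filling in the unspecified `mix` and `blend` with assumed choices (an integer hash and a bilinear blend through a smoothstep ease):

```java
// Minimal value noise; hash constants and the smoothstep blend are
// assumptions, not part of the original sketch.
public class ValueNoise2D {
    // Hash two lattice coordinates to a pseudo-random float on [0,1].
    static float mix(int ix, int iy) {
        int h = ix * 374761393 + iy * 668265263;
        h = (h ^ (h >>> 13)) * 1274126177;
        return ((h ^ (h >>> 16)) & 0x7FFFFFFF) / (float) 0x7FFFFFFF;
    }

    // Smoothstep ease so values blend less harshly inside the cell.
    static float smooth(float t) { return t * t * (3 - 2 * t); }

    static float lerp(float t, float a, float b) { return a + t * (b - a); }

    public static float eval(float x, float y) {
        int ix = (int) Math.floor(x), iy = (int) Math.floor(y);
        float dx = x - ix, dy = y - iy;
        float r00 = mix(ix, iy),     r10 = mix(ix + 1, iy);
        float r01 = mix(ix, iy + 1), r11 = mix(ix + 1, iy + 1);
        float sx = smooth(dx), sy = smooth(dy);
        float xb = lerp(sx, r00, r10);   // blend bottom edge in x
        float xt = lerp(sx, r01, r11);   // blend top edge in x
        return lerp(sy, xb, xt);         // blend the two in y
    }
}
```

At an integer lattice point the blend degenerates to the corner's hash value, which makes the cell structure easy to see.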

So to compute value noise in 'n' dimensions, the work required is related to 2^n (1D = line segment or 2 vertices, 2D = square or 4 verts, 3D = cube and 8, etc).  The problems with value noise stem from the fact that at each evaluation point, the result depends only on blended data interior to the cell it is within.  As a result, sample points close to one another but in different cells do not vary continuously, producing very obvious defects along cell boundaries.  Early attempts to fix this major problem included visiting further-away cells and using more complex blending functions, which drastically increase complexity.  The introduction of gradient noise made these solutions obsolete.

References

Gradient noise

Created in 1983 by Ken Perlin, this Oscar award winning technique is a clever way to minorly modify value noise to drastically improve the output quality.  Usually when one is (correctly) calling a noise function "Perlin" noise, this is the technique being discussed.  The clever addition is to choose a vector associated with each vertex (the gradient vector), then to calculate the vector from the vertex to the sample point.  The dot product between these two vectors gives a weighting used to modify the value at each vertex.  It was quickly noted that this last step is not really useful and that the dot product itself is a more than sufficiently random value (dropping one multiply).  Finally the dot product results at the vertices are interpolated to generate the result.  Notice that like value noise, the output depends entirely on the evaluation of a single cell and has the same complexity in the number of dimensions.  The difference here is that the random vector helps to smooth out values across neighboring cells...much in the same way that Gouraud shading improves over flat shading.  Sketch in 2D:

float eval(float x, float y)
{
  // lower left hand corner of cell containing (x,y)
  int ix = (int)Math.floor(x);
  int iy = (int)Math.floor(y);

  // offset into 'cell' of (x,y). dx & dy are on [0,1)
  x -= ix;
  y -= iy;

  // generate a random value for each vertex of the cell
  // based on its integer coordinate.
  int h00 = mix(ix,   iy);
  int h10 = mix(ix+1, iy);
  int h01 = mix(ix,   iy+1);
  int h11 = mix(ix+1, iy+1);

  // some function that uses the random number to 'dot'
  // against a random vector. Note the '-1' compared to the '+1'
  // above.
  float r00 = dotRandVect(h00, x,   y);
  float r10 = dotRandVect(h10, x-1, y);
  float r01 = dotRandVect(h01, x,   y-1);
  float r11 = dotRandVect(h11, x-1, y-1);

  // convert the offset into the cell into a weighting factor,
  // the so-called ease or s-curve
  x = weight(x);
  y = weight(y);

  // blend to get the final result
  float xb = lerp(x, r00, r10);
  float xt = lerp(x, r01, r11);
  return lerp(y, xb, xt);
}
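As a runnable sketch, here is one way to fill in `mix`, `dotRandVect` and `weight`, using the cheap "small set of gradient vectors" variant described in the notes below; the hash and gradient table are assumptions:

```java
// Minimal gradient noise; hash constants and the 8-gradient table are
// assumed choices, not the original implementation.
public class GradientNoise2D {
    static int mix(int ix, int iy) {
        int h = ix * 374761393 + iy * 668265263;
        h = (h ^ (h >>> 13)) * 1274126177;
        return h ^ (h >>> 16);
    }

    // Dot the offset against one of 8 gradients chosen from the hash
    // (components drawn from {-1, 0, +1}, not all zero).
    static float dotRandVect(int h, float dx, float dy) {
        switch (h & 7) {
            case 0:  return  dx + dy;
            case 1:  return  dx - dy;
            case 2:  return -dx + dy;
            case 3:  return -dx - dy;
            case 4:  return  dx;
            case 5:  return -dx;
            case 6:  return  dy;
            default: return -dy;
        }
    }

    // Perlin's quintic ease: 10t^3 - 15t^4 + 6t^5 (C2 continuous).
    static float weight(float t) { return t * t * t * (t * (t * 6 - 15) + 10); }

    static float lerp(float t, float a, float b) { return a + t * (b - a); }

    public static float eval(float x, float y) {
        int ix = (int) Math.floor(x), iy = (int) Math.floor(y);
        x -= ix; y -= iy;
        float r00 = dotRandVect(mix(ix,     iy    ), x,     y    );
        float r10 = dotRandVect(mix(ix + 1, iy    ), x - 1, y    );
        float r01 = dotRandVect(mix(ix,     iy + 1), x,     y - 1);
        float r11 = dotRandVect(mix(ix + 1, iy + 1), x - 1, y - 1);
        float wx = weight(x), wy = weight(y);
        return lerp(wy, lerp(wx, r00, r10), lerp(wx, r01, r11));
    }
}
```

One easy sanity check: at every integer lattice point the offset vector is zero, so all dot products (and therefore the result) are exactly zero, unlike value noise.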

Note that there have been numerous improvements made to gradient noise over the years, so some references may be referring to older versions. And, of course, authors may make minor tweaks (for better or worse) to their specific implementation.

Variants of note:
• Originally the vectors were randomly generated unit vectors. Creating these on the fly is rather expensive. In days of yore a precomputed table of random vectors was an option (less so today given memory access overhead).  The reduced number of random vectors introduces some very minor directional defects.

• Perlin later noted that using a small set of vectors (all the permutations of vector components of zero and +/-one, but not all zero) drastically reduces computational cost.  Specifically this drops 1 multiply per dimension per vertex (2*4 in 2D, 3*8 in 3D).  It also significantly increases directional defects (SEE: Defects below).  Some GPU implementations use a more mathematically complex selection to address this issue.
• Two ease functions: Perlin uses a weight function which he terms either ease or s-curve.  The original function was 3t^2 - 2t^3.  Its first derivative is continuous at cell boundaries, but its second derivative is not.  This was later replaced by the more expensive 10t^3 - 15t^4 + 6t^5, which is C2 continuous.
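The two ease curves can be written directly; both map 0 to 0 and 1 to 1, with zero slope at the endpoints:

```java
public class Ease {
    // Original cubic s-curve: 3t^2 - 2t^3 (C1 across cell boundaries).
    public static float cubic(float t)   { return t * t * (3 - 2 * t); }

    // Improved quintic: 10t^3 - 15t^4 + 6t^5 (C2 across cell boundaries).
    public static float quintic(float t) { return t * t * t * (t * (t * 6 - 15) + 10); }
}
```

The factored forms avoid recomputing powers of t; the quintic costs roughly two more multiplies per call.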

References

Perlin simplex noise

In 2002 Ken Perlin introduced a new noise function that is a drastic change in direction.  The purpose was to create a function which could be cheaply implemented in hardware and addresses some of the defects in gradient noise.  Although designed for hardware it is a better fit for modern CPU and GPU architectures.

The first major change is how cells are formed.  Instead of breaking up space into a regular grid, the input is skewed onto a simplex grid (SEE: Stefan Gustavson's paper for details).  This drops the number of vertices needed from 2^n to (n+1), where 'n' is the number of dimensions.

The second major change is that instead of calculating values at each vertex and blending them to compute the final result, the result is a summation of contributions from each vertex.  This shortens the dependency chain and can increase throughput.  For example in 2D value and gradient noise, one might first blend in "X" the top edge, then the bottom (these two are independent), then take those results and blend in "Y" to get a final result.  In 2D simplex noise, the contributions from the three vertices are independently computed and summed to produce the result.

As a rule of thumb, if you need noise (of this variety) in three or four dimensions, then simplex noise is the way to go.

References

Defects

Noise is one of those areas where science and art collide.  As such, the various listed defects only really have meaning if they have a negative impact on the desired result.
• hash function:
• aligned cell structure:

The cheapest way to attempt to hide these defects is to ensure that the grid structures of multiple noise evaluations are not aligned with one another.

Isotropic and anisotropic

Isotropic is math-speak for uniform in all directions and anisotropic is, well, not...the thing in question isn't uniform in all directions.  The goal of all the above noise functions is to be isotropic.  All, however, have directional defects which make this not quite true.  Getting anisotropic results from isotropic noise simply involves applying a non-uniform scale factor when sampling.
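The non-uniform scale trick is a one-liner.  The `noise` function below is a deterministic hash stand-in (an assumption, not real coherent noise); substitute any of the eval() sketches above:

```java
// Non-uniform scaling of the sample coordinate turns any isotropic
// noise anisotropic. 'noise' is a placeholder stand-in.
public class Aniso {
    static float noise(float x, float y) {
        // placeholder: deterministic hash, NOT coherent noise
        int h = Float.floatToIntBits(x) * 374761393
              ^ Float.floatToIntBits(y) * 668265263;
        h = (h ^ (h >>> 13)) * 1274126177;
        return ((h ^ (h >>> 16)) & 0x7FFFFFFF) / (float) 0x7FFFFFFF;
    }

    // e.g. sx=1, sy=8 samples 8x faster along y: features become
    // short in y and stretched along x (streaks).
    public static float eval(float x, float y, float sx, float sy) {
        return noise(x * sx, y * sy);
    }
}
```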

Periodic noise

The sketches above are for noise functions without a period.  It is commonly desirable to have noise be periodic, or in other words to wrap at specific boundaries.  Well, there's good news and bad news.  The bad news is that most "methods" to make noise periodic are very expensive and don't really work (SEE: Matt Zucker's FAQ above for an example).  The good news is that it's simple to perform cheaply, assuming that wrapping at integer boundaries, and in particular power-of-two boundaries, is an acceptable limitation.  A minor modification to the vertex computation allows this: masking in the case of power-of-two, and "faking" an integer modulo in other cases.  This requires modifying the base noise function (special cases, dynamic code generation, etc.).  Another option is to use a noise function in a dimension (potentially) higher than desired and to "walk" that space in such a way that you reach the same coordinate at boundary points.  The latter happens somewhat naturally if computation is performed at runtime on the GPU.  As an example, to apply noise to a sphere (or any other 3D object), one simply samples a 3D noise function at a scaled and/or translated coordinate of the object's surface (or a 4D function if the noise is to be animated in time).
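The power-of-two masking trick applied to the value noise sketch might look like this (hash constants are assumptions; only the vertex hash changes):

```java
// Power-of-two periodic value noise: masking the lattice coordinate
// before hashing makes the vertex values, and hence the noise,
// repeat every 'period' cells along each axis.
public class PeriodicValueNoise2D {
    final int mask;  // period - 1; period must be a power of two

    public PeriodicValueNoise2D(int period) { mask = period - 1; }

    float mix(int ix, int iy) {
        ix &= mask; iy &= mask;   // wrap the vertex coordinate
        int h = ix * 374761393 + iy * 668265263;
        h = (h ^ (h >>> 13)) * 1274126177;
        return ((h ^ (h >>> 16)) & 0x7FFFFFFF) / (float) 0x7FFFFFFF;
    }

    static float smooth(float t) { return t * t * (3 - 2 * t); }
    static float lerp(float t, float a, float b) { return a + t * (b - a); }

    public float eval(float x, float y) {
        int ix = (int) Math.floor(x), iy = (int) Math.floor(y);
        float dx = x - ix, dy = y - iy;
        float sx = smooth(dx), sy = smooth(dy);
        float xb = lerp(sx, mix(ix, iy),     mix(ix + 1, iy));
        float xt = lerp(sx, mix(ix, iy + 1), mix(ix + 1, iy + 1));
        return lerp(sy, xb, xt);
    }
}
```

The mask costs two AND instructions per vertex, which is why wrapping at power-of-two boundaries is essentially free.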

Optimizations

Noise functions tend to be expensive, and many calls are usually required to create a specific effect, so speed is pretty important when computed at runtime.  Given the nature of noise it is a very good candidate for running on the GPU.


Other noise functions

There are many other noise functions, many of which are too complex to be evaluated at runtime but may have game usage for pre-generated content:

• Anisotropic noise
• Gabor noise: not the same family, but can generate similar results.
• Sparse convolution noise: realtime variants potentially reasonable on the GPU
• Wavelet noise

References
Roquen
 « Reply #1 - Posted 2012-08-10 18:17:29 »

OK.  Does my first pass at the theory and why I've put it there make sense to someone that knows zero about signal processing?
erikd

JGO Ninja

Medals: 16
Projects: 4
Exp: 14 years

Maximumisness

 « Reply #2 - Posted 2012-08-14 18:36:00 »

I'm getting the general gist of it, but I have to admit I know a thing or 2 about signal processing.
But my general feeling about the article is that it kind of covers too many things at once at a purely theoretical level without being very practical.
Perhaps you could try targeting it to a developer with a specific need, for example procedural texture generation, height-map generation, or some other procedural content generation, and then explain why a certain noise algorithm would make sense in that particular case.

I'm not trying to deride your article (in fact I'm very interested in the subject), but I feel it covers too many areas to be useful in just one wiki article.

Junior Devvie

 « Reply #3 - Posted 2012-08-18 19:37:14 »

I think people that don't already know the subject will be lost. What's the purpose of bandpass filtering? Isn't the goal of certain noise algorithms to achieve a subjective aesthetic effect?
pjt33

« JGO Spiffy Duke »

Medals: 40
Projects: 4
Exp: 7 years

 « Reply #4 - Posted 2012-08-18 22:59:13 »

To someone who knows a bit about signal processing, it raises a number of questions. E.g. did you intend your description of decomposition in terms of basis functions to be broad enough to include Taylor expansion? Would it be worth defining "signal"? Does it make sense to talk about Fourier analysis of non-periodic functions in an introduction to noise?

Someone who doesn't know anything about signal processing is guaranteed to not know what you mean by the frequency domain. They may also pick up on the Gibbs phenomenon in the pictures about creating a square wave, and wonder whether it contradicts what you're saying. And they won't have a clue what "band-pass filtered" means.
Roquen
 « Reply #5 - Posted 2012-08-19 13:10:17 »

Well, talking in a hand-waving kind of way about signal processing is tricky.  I guess the more important question is whether it's even worth talking about at all?
keldon85

Senior Devvie

Medals: 1

 « Reply #6 - Posted 2012-08-19 15:46:27 »

I've seen some sensational shaders making use of Perlin noise to make some really convincing wood textures, and for scene generation it helps to have a little taste for different ways of producing controlled modulation.

Though it would help if we could arrange the structure a little to be more helpful in some way because this is a very useful topic to be covered.

philfrei
 « Reply #7 - Posted 2012-08-19 21:15:48 »

I'm trying to get better acquainted with using noise in textures right now, grappling with it conceptually, having just had my first working experience with calling Simplex noise to assemble a cloudy texture.

I'm not at all clear that getting into Fourier analysis is helpful.  I can see where using 'harmonics' can make the coding neater, and it seems to work well with the mathematics of fractals, but it doesn't seem to be entirely necessary.  One can add noise that has energy at frequencies that are unrelated to the base frequency with no problem.  Visual textures are not like sound waves, where one deals with the prevalence of nodes, anti-nodes and standing waves in the "real world," and where the ear and the hearing portions of the brain have evolved to make use of data in this form.

So, is it simpler, instead, to describe noise as having components that are at various periods, and not worry about the Fourier analysis, at least, at the "beginner" level? Or are techniques to analyze textures to determine their strongest component frequencies in use and an important part of creating textures?

I'm thinking, for a dimension, given a length L and a value "n" along that length, a "basic" unit of noise might be of length n/L. Since n can go from 0 to L, the result of this fraction is 0 to 1. One can multiply this value by different factors to get different degrees of scaling. Obtaining noise with (n/L * K) will be K times more detailed than noise at n/L.

But we are free to make K whatever we want. K can be a float or double. It doesn't have to be an integer or a progression of integers.

(We might also talk about how to "relate" the periodicity of one type of noise to another via a scaling factor that is applied to the n/L (O to 1) results of the different noise generation techniques? Maybe this is already done?)

Then, there is total latitude with what we do with the output noise values (which range from -1 to 1), whether to sum them or lerp them, or use them in trig functions. It seems wide open, as long as the function results in a legal Color value for a pixel.

(It occurs to me, one could also talk about a more concrete value the periodic nature of noise by finding a number of pixels that corresponds to an average of one swing in the random number. But I think I am getting into fuzzy thinking, as I don't know how to describe a "period" of randomness, and the way in which we relate the numbers to pixels on the screen is so fluid.)

As I said, I'm a beginner with using noise, and am happy to be corrected on any point.

"We all secretly believe we are right about everything and, by extension, we are all wrong." W. Storr, The Unpersuadables
Roquen
 « Reply #8 - Posted 2012-08-20 05:51:21 »

Doing octaves is exactly the kind of construction that's obvious in the frequency domain, and one reason why I thought this might be useful.  I'm thinking about a completely different track that describes these as coherent pseudo-random numbers and walks through the historic progression of how they work.  Detailed usage is a ton of work, and I was thinking that providing a bunch of links would be reasonable for a first pass.  (plus I'm too lazy to make pictures)
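The octave ("fBm") construction mentioned here can be sketched as follows; `noise` is a deterministic hash stand-in (an assumption), and any coherent noise function from the wiki entry can be substituted:

```java
// Octave summation: sum copies of a base noise, doubling frequency
// and halving amplitude each octave.
public class Octaves {
    static float noise(float x, float y) {
        // placeholder stand-in, NOT coherent noise
        int h = Float.floatToIntBits(x) * 374761393
              ^ Float.floatToIntBits(y) * 668265263;
        h = (h ^ (h >>> 13)) * 1274126177;
        return ((h ^ (h >>> 16)) & 0x7FFFFFFF) / (float) 0x7FFFFFFF;
    }

    public static float fbm(float x, float y, int octaves) {
        float sum = 0, amp = 0.5f, freq = 1;
        for (int i = 0; i < octaves; i++) {
            sum  += amp * noise(x * freq, y * freq);
            freq *= 2;    // each octave doubles in frequency...
            amp  *= 0.5f; // ...and contributes half the amplitude
        }
        return sum;
    }
}
```

In frequency-domain terms each octave adds a band one octave higher at half the energy, which is what gives the familiar 1/f "fractal" look.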
keldon85

Senior Devvie

Medals: 1

 « Reply #9 - Posted 2012-08-20 22:20:55 »

Hmm, when generating textures like clouds and terrain it helps to know a little about the effects of noise colouring, distortion and how it can be fun to mess around with to produce different results.

And thinking back to clouds, isn't the very reason you see clouds down to the low-frequency bias, with the overtones creating the fuzziness?

I think it would help to show the differences visually though, or even better yet, produce an app to demonstrate it. I learned a lot just by tweaking my 2d noise generator. This could be the difference between generating really mundane levels or elaborate worlds that feel like they've been really well designed  IMO.

I agree with what philfrei said about having components at various periods, and experimentation is key.  When I was at uni I remember toying around rendering series upon series of sine waves, some directional, some radial, until (*I swear*) it looked like plasma.  Of course that wasn't the memorable part; the memorable part was when I made a few small modifications and ... err ... it stopped looking awesome, and I had to investigate and figure out what on Earth was going on.  Still don't know how I did it.

In terms of games, I think it's key to remember that these random values can feed into game behaviour.  This could be what gives flavour to your map generation algorithm.  The more applicable and relevant to games the examples are, the more people will digest and be able to use them.

Roquen
 « Reply #10 - Posted 2012-08-21 09:52:06 »

No doubt experience is all-important for creating effects and the theory is of marginal use.  My thinking is more geared toward choosing what set of base generators to use.  Precomputed effects are easy: improved gradient noise if you want to quickly bang stuff out based on other people's work (as pretty much everything is written against gradient noise), and/or simplex noise.  The 'better' noise methods are too expensive for fast turnaround to be useful IMHO.  The trickier part is runtime-generated stuff.
Roquen
 « Reply #11 - Posted 2012-08-23 13:27:44 »

OK. I made a first pass at a second pass.  Any better or still wankery?
philfrei
 « Reply #12 - Posted 2012-08-23 17:48:52 »

Many improvements!

"We all secretly believe we are right about everything and, by extension, we are all wrong." W. Storr, The Unpersuadables
Roquen
 « Reply #13 - Posted 2012-09-27 14:59:16 »

Fixed some typos.  Completed "brief" and added a sketch for gradient noise.
philfrei
 « Reply #14 - Posted 2012-09-27 18:40:00 »

Would love to get some more feedback as to what can be done to make the visualizer I started [http://www.java-gaming.org/topics/simplex-noise-experiments-towards-procedural-generation/27163/view.html] something you'd consider adding as a link on the main page of this wiki.

P.S., I'm seriously looking at "open sourcing" the project on GitHub, making the emphasis more on helping devs write and test a wider range of textures. (Just figured out "perspective" but haven't integrated it yet.)

"We all secretly believe we are right about everything and, by extension, we are all wrong." W. Storr, The Unpersuadables
matheus23

JGO Kernel

Medals: 121
Projects: 3

You think about my Avatar right now!

 « Reply #15 - Posted 2012-09-27 18:44:52 »

P.S., I'm seriously looking at "open sourcing" the project on GitHub, making the emphasis more on helping devs write and test a wider range of textures. (Just figured out "perspective" but haven't integrated it yet.)
This is almost always a very good idea! Do this! (I'd be interested too, but mind licensing!)

See my development Blog: http://matheusdev.tumblr.com | my RPG: Ruins of Revenge | my coding: on Github
Roquen
 « Reply #16 - Posted 2012-09-27 19:52:34 »

Hey, it's a wiki...do it yourself!  Seriously, I was thinking this is getting about as long as is reasonable, and that talking about the basics of using noise should be on another page with code snippets like your tutorial.  There's no reason why your tool shouldn't be linked from both.
philfrei
 « Reply #17 - Posted 2012-09-27 21:38:53 »

Doh! Looky there. A "modify" button on the first post.

"We all secretly believe we are right about everything and, by extension, we are all wrong." W. Storr, The Unpersuadables
Roquen
 « Reply #18 - Posted 2013-04-05 15:36:37 »

Tossed together a quick WebGL demo (link in overview).
matheus23

JGO Kernel

Medals: 121
Projects: 3

You think about my Avatar right now!

 « Reply #19 - Posted 2013-12-27 11:55:10 »

Quote
So to compute value noise in 'n' dimensions, the work required is related to n2 (1D = line segment or 2 vertices, 2D = square or 4 verts, 3D = cube and 8, etc)

Shouldn't it be 2^n? I'm not sure; that's why I'm asking...

1D = 2^1 = 2 vertices
2D = 2^2 = 4 vertices
3D = 2^3 = 8 vertices
4D = 2^4 = 16 vertices
...

Yeah. Pretty sure now.

Roquen
 « Reply #20 - Posted 2013-12-27 12:52:04 »

I'm seeing 2^n in what you've quoted...check your browser and/or the source that you're seeing.  I see that it's wrong in the simplex noise part though (reading n^2)...sigh.
matheus23

JGO Kernel

Medals: 121
Projects: 3

You think about my Avatar right now!

 « Reply #21 - Posted 2013-12-27 12:52:36 »

I'm seeing 2^n in what you've quoted...check your browser and/or the source that you're seeing.  I see that it's wrong in the simplex noise part though (reading n^2)...sigh.

Yeah, I've modified it in the first part already ^^

Roquen
 « Reply #22 - Posted 2013-12-27 13:59:25 »

Wait!  You...fixed....it...and Wait!  You corrected the second error as well?  Someone understands the concept of a wiki!  I'm so happy I could cry.  (nice catch).
matheus23

JGO Kernel

Medals: 121
Projects: 3

You think about my Avatar right now!

 « Reply #23 - Posted 2013-12-27 14:09:25 »

Wait!  You...fixed....it...and Wait!  You corrected the second error as well?  Someone understands the concept of a wiki!  I'm so happy I could cry.  (nice catch).
