
Is an uninitialized local variable the fastest random number generator?

I know that an uninitialized local variable is undefined behaviour (UB), and that the value may have trap representations which may affect further operations. But sometimes I want to use the random number only for visual presentation and will not use it further in other parts of the program. For example, set something with a random color in a visual effect:

void updateEffect(){
    for(int i=0;i<1000;i++){
        int r;
        int g;
        int b;
        star[i].setColor(r%255,g%255,b%255);
        bool isVisible;
        star[i].setVisible(isVisible);
    }
}

Is it faster than

void updateEffect(){
    for(int i=0;i<1000;i++){
        star[i].setColor(rand()%255,rand()%255,rand()%255);
        star[i].setVisible(rand()%2==0?true:false);
    }
}

And is it also faster than other random number generators?

+1 This is a perfectly legitimate question. It's true that in practice, uninitialized values might be kind of random. The fact that they aren't particularly random, and that it's UB, doesn't make asking this bad.
@imallett: Absolutely. This is a good question, and at least one old Z80 (Amstrad / ZX Spectrum) game in times past used its program as the data to set up its terrain. So there are even precedents. Can't do that these days. Modern operating systems take away all the fun.
Surely the main problem is that it isn't random.
In fact, there's an example of an uninitialized variable being used as a random value; see the Debian RNG disaster (Example 4 in this article).
In practice - and believe me, I do a lot of debugging on various architectures - your solution can do two things: either read uninitialized registers or uninitialized memory. Now while "uninitialized" means random in a certain manner, in practice it will most likely contain a) zeroes, b) repeating or consistent values (in case of reading memory formerly occupied by digital media) or c) consistent garbage with a limited value set (in case of reading memory formerly occupied by encoded digital data). None of those are real entropy sources.

Jonathan Leffler

As others have noted, this is Undefined Behavior (UB).

In practice, it will (probably) actually (kind of) work. Reading from an uninitialized register on x86[-64] architectures will indeed produce garbage results, and probably won't do anything bad (as opposed to e.g. Itanium, where registers can be flagged as invalid, so that reads propagate errors like NaN).

There are two main problems though:

It won't be particularly random. In this case, you're reading from the stack, so you'll get whatever was there previously. Which might be effectively random, completely structured, the password you entered ten minutes ago, or your grandmother's cookie recipe. It's Bad (capital 'B') practice to let things like this creep into your code. Technically, the compiler could insert reformat_hdd(); every time you read an undefined variable. It won't, but you shouldn't do it anyway. Don't do unsafe things. The fewer exceptions you make, the safer you are from accidental mistakes.

The more pressing issue with UB is that it makes your entire program's behavior undefined. Modern compilers can use this to elide huge swaths of your code or even go back in time. Playing with UB is like a Victorian engineer dismantling a live nuclear reactor. There's a zillion things to go wrong, and you probably won't know half of the underlying principles or implemented technology. It might be okay, but you still shouldn't let it happen. Look at the other nice answers for details.

Also, I'd fire you.


@Potatoswatter: Itanium registers can contain NaT (Not a Thing) which in effect is an "uninitialized register". On Itanium, reading from a register when you haven't written to it may abort your program (read more on it here: blogs.msdn.com/b/oldnewthing/archive/2004/01/19/60162.aspx). So there is a good reason why reading uninitialized values is undefined behaviour. It's also probably one reason why Itanium is not very popular :)
I really object to the "it kind of works" notion. Even if it were true today, which it is not, it might change at any time due to more aggressive compilers. The compiler can replace any read with unreachable() and delete half of your program. This does happen in practice as well; this behavior completely neutralized the RNG in some Linux distro, I believe. Most answers in this question seem to assume that an uninitialized value behaves like a value at all. That's false.
Also, I'd fire you seems a rather silly thing to say, assuming good practices this should be caught at code review, discussed and should never happen again. This should definitely be caught since we are using the correct warning flags, right?
@Michael Actually, it is. If a program has undefined behaviour at any point, the compiler can optimise your program in a way that affects code preceding that invoking the undefined behaviour. There are various articles and demonstrations of how mind boggling this can get Here's quite a good one: blogs.msdn.com/b/oldnewthing/archive/2014/06/27/10537746.aspx (which includes the bit in the standard that says all bets are off if any path in your program invokes UB)
This answer makes it sound as if "invoking undefined behavior is bad in theory, but it won't really hurt you much in practice". That's wrong. Collecting entropy from an expression that would cause UB can (and probably will) cause all the previously collected entropy to be lost. This is a serious hazard.
Shafik Yaghmour

Let me say this clearly: we do not invoke undefined behavior in our programs. It is never ever a good idea, period. There are rare exceptions to this rule; for example, if you are a library implementer implementing offsetof. If your case falls under such an exception you likely know this already. In this case we know using uninitialized automatic variables is undefined behavior.

Compilers have become very aggressive with optimizations around undefined behavior, and we can find many cases where undefined behavior has led to security flaws. The most infamous case is probably the Linux kernel null pointer check removal, which I mention in my answer to C++ compilation bug?, where a compiler optimization around undefined behavior turned a finite loop into an infinite one.
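To make the mechanism concrete, here is a hypothetical sketch of the pattern behind that class of bug (illustrative code, not the actual kernel source): dereferencing a pointer before checking it licenses the compiler to assume the pointer is non-null and delete the check.

int readValue(int* p) {
    int v = *p;           // UB if p is null, so the compiler may assume p != nullptr...
    if (p == nullptr)     // ...and remove this now-"dead" null check entirely
        return -1;
    return v;
}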

We can read CERT's Dangerous Optimizations and the Loss of Causality (video) which says, amongst other things:

Increasingly, compiler writers are taking advantage of undefined behaviors in the C and C++ programming languages to improve optimizations. Frequently, these optimizations are interfering with the ability of developers to perform cause-effect analysis on their source code, that is, analyzing the dependence of downstream results on prior results. Consequently, these optimizations are eliminating causality in software and are increasing the probability of software faults, defects, and vulnerabilities.

Specifically with respect to indeterminate values, the C standard defect report 451: Instability of uninitialized automatic variables makes for some interesting reading. It has not been resolved yet, but it introduces the concept of wobbly values, which means the indeterminateness of a value may propagate through the program, and the variable can have different indeterminate values at different points in the program.

I don't know of any examples where this happens but at this point we can't rule it out.

Real examples, not the result you expect

You are unlikely to get random values. A compiler could optimize away the loop altogether. For example, with this simplified case:

void updateEffect(int arr[20]){
    for(int i=0;i<20;i++){
        int r;
        arr[i] = r;
    }
}

clang optimizes it away (see it live):

updateEffect(int*):                     # @updateEffect(int*)
    retq

or perhaps get all zeros, as with this modified case:

void updateEffect(int arr[20]){
    for(int i=0;i<20;i++){
        int r;
        arr[i] = r%255;
    }
}

see it live:

updateEffect(int*):                     # @updateEffect(int*)
    xorps   %xmm0, %xmm0
    movups  %xmm0, 64(%rdi)
    movups  %xmm0, 48(%rdi)
    movups  %xmm0, 32(%rdi)
    movups  %xmm0, 16(%rdi)
    movups  %xmm0, (%rdi)
    retq

Both of these cases are perfectly acceptable forms of undefined behavior.

Note that if we are on Itanium, we could end up with a trap value:

[...]if the register happens to hold a special not-a-thing value, reading the register traps except for a few instructions[...]

Other important notes

It is interesting to note the variance between gcc and clang, documented in the UB Canaries project, over how willing each is to take advantage of undefined behavior with respect to uninitialized memory. The article notes (emphasis mine):

Of course we need to be completely clear with ourselves that any such expectation has nothing to do with the language standard and everything to do with what a particular compiler happens to do, either because the providers of that compiler are unwilling to exploit that UB or just because they have not gotten around to exploiting it yet. When no real guarantee from the compiler provider exists, we like to say that as-yet unexploited UBs are time bombs: they’re waiting to go off next month or next year when the compiler gets a bit more aggressive.

As Matthieu M. points out, What Every C Programmer Should Know About Undefined Behavior #2/3 is also relevant to this question. It says, amongst other things (emphasis mine):

The important and scary thing to realize is that just about any optimization based on undefined behavior can start being triggered on buggy code at any time in the future. Inlining, loop unrolling, memory promotion and other optimizations will keep getting better, and a significant part of their reason for existing is to expose secondary optimizations like the ones above. To me, this is deeply dissatisfying, partially because the compiler inevitably ends up getting blamed, but also because it means that huge bodies of C code are land mines just waiting to explode.

For completeness' sake, I should probably mention that implementations can choose to make undefined behavior well defined; for example, gcc allows type punning through unions, while in C++ this seems to be undefined behavior. If that is the case, the implementation should document it, and this will usually not be portable.


+(int)(PI/3) for the compiler output examples; a real-life example that UB is, well, UB.
Utilizing UB effectively used to be the trademark of an excellent hacker. This tradition has gone on for probably 50 years or more now. Unfortunately, computers are now required to minimize the effects of UB because of Bad People. I really enjoyed figuring out how to do cool things with UB machine code or port reads/writes, etc., in the 90s, when the OS wasn't as capable of protecting the user from themselves.
@sfdcfox if you were doing it in machine code / assembler, it wasn't undefined behavior (it may have been unconventional behavior).
If you have a specific assembly in mind, then use that and don't write uncompliant C. Then everyone will know you are using a specific non-portable trick. And it's not Bad People who mean you can't use UB, it's Intel etc doing their tricks on chip.
@500-InternalServerError because they may not be easily detectable, or may not be detectable at all in the general case, and therefore there would be no way to disallow them. Which is different from violations of the grammar, which can be detected. We also have ill-formed and ill-formed, no diagnostic required, which in general separates poorly formed programs that could be detected in theory from those that in theory could not be reliably detected.
Bathsheba

No, it's terrible.

The behaviour of using an uninitialised variable is undefined in both C and C++, and it's very unlikely that such a scheme would have desirable statistical properties.

If you want a "quick and dirty" random number generator, then rand() is your best bet. In its implementation, all it does is a multiplication, an addition, and a modulus.

The fastest generator I know of requires you to use a uint32_t as the type of the pseudo-random variable I, and use

I = 1664525 * I + 1013904223

to generate successive values. You can choose any initial value of I (called the seed) that takes your fancy. Obviously you can code that inline. The standard-guaranteed wraparound of an unsigned type acts as the modulus. (The numeric constants are hand-picked by that remarkable scientific programmer Donald Knuth.)
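A minimal sketch of that generator, in case it helps (the function name nextRandom and the seed value are illustrative, not part of any standard API):

#include <cstdint>

// 32-bit linear congruential generator; the unsigned wraparound
// at 2^32 acts as the modulus.
inline std::uint32_t nextRandom(std::uint32_t& I) {
    I = 1664525u * I + 1013904223u;
    return I;
}

// Usage: pick any seed you like, then call repeatedly.
//   std::uint32_t I = 42;
//   int r = nextRandom(I) % 256;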


The "linear congruential" generator that you present is good for simple applications, but only for non-cryptographic applications. It is possible to predict its behavior. See for example "Deciphering a linear congruential encryption" by Don Knuth himself (IEEE Transactions on Information Theory, Volume 31)
@Jay compared to an uninitialized variable for quick and dirty? This is a much better solution.
rand() is not fit for purpose and should be entirely deprecated, in my opinion. These days you can download freely licensed and vastly superior random number generators (e.g. Mersenne Twister) that are very nearly as fast, with the greatest of ease, so there's really no need to continue using the highly defective rand().
rand() has another terrible issue: it takes a kind of lock internally, so calling it from multiple threads slows down your code dramatically. At least there's a reentrant version. And if you use C++11, the random API provides everything you need (see the sketch after these comments).
To be fair, he didn't ask if it was a good random number generator. He asked if it was fast. Well, yes, it's probably the fastest. But the results will not be very random at all.
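For reference, a minimal sketch of the C++11 <random> approach mentioned in the comment above, reusing the question's star array (an assumption borrowed from the question, not defined here):

#include <random>

void updateEffect() {
    static std::mt19937 gen{std::random_device{}()};     // seeded once
    std::uniform_int_distribution<int> channel(0, 255);  // full color range
    std::bernoulli_distribution visible(0.5);            // fair coin flip
    for (int i = 0; i < 1000; i++) {
        star[i].setColor(channel(gen), channel(gen), channel(gen));
        star[i].setVisible(visible(gen));
    }
}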
meaning-matters

Good question!

Undefined does not mean it's random. Think about it: the values you'd get in global uninitialized variables were left there by the system or by your/other running applications. Depending on what your system does with no-longer-used memory and/or what kind of values the system and applications generate, you may get:

- always the same value;
- one of a small set of values;
- values in one or more small ranges;
- many values divisible by 2/4/8, from pointers on 16/32/64-bit systems;
- ...

The values you'll get completely depend on which non-random values are left by the system and/or applications. So, indeed there will be some noise (unless your system wipes no longer used memory), but the value pool from which you'll draw will by no means be random.

Things get much worse for local variables, because these come directly from the stack of your own program. There is a very good chance that your program will actually write these stack locations during the execution of other code. I estimate the chances of luck in this situation as very low, and every 'random' code change you make tries this luck again.

Read about randomness. As you'll see, randomness is a very specific and hard-to-obtain property. It's a common mistake to think that if you just take something that's hard to track (like your suggestion), you'll get a random value.


... and that's leaving out all the compiler optimizations which would completely gut that code.
... You will get different "randomness" in Debug and Release builds. Undefined means you are doing it wrong.
Right. I'd abbreviate or summarize with "undefined" != "arbitrary" != "random". All these kinds of "unknownness" have different properties.
Global variables are guaranteed to have a defined value, whether explicitly initialized or not. This is definitely true in C++ and in C as well.
Viktor Toth

Many good answers, but allow me to add another and stress the point that in a deterministic computer, nothing is random. This is true both for the numbers produced by a pseudo-RNG and for the seemingly "random" numbers found in areas of memory reserved for C/C++ local variables on the stack.

BUT... there is a crucial difference.

The numbers generated by a good pseudorandom generator have properties that make them statistically similar to truly random draws:

- The distribution is uniform.
- The cycle length is long: you can get millions of random numbers before the cycle repeats itself.
- The sequence is not autocorrelated: for instance, you will not begin to see strange patterns emerge if you take every 2nd, 3rd, or 27th number, or if you look at specific digits in the generated numbers.

In contrast, the "random" numbers left behind on the stack have none of these properties. Their values and their apparent randomness depend entirely on how the program is constructed, how it is compiled, and how it is optimized by the compiler. By way of example, here is a variation of your idea as a self-contained program:

#include <stdio.h>

void notrandom(void)
{
        int r, g, b;    /* deliberately left uninitialized */

        printf("R=%d, G=%d, B=%d", r&255, g&255, b&255);
}

int main(int argc, char *argv[])
{
        int i;
        for (i = 0; i < 10; i++)
        {
                notrandom();
                printf("\n");
        }

        return 0;
}

When I compile this code with GCC on a Linux machine and run it, it turns out to be rather unpleasantly deterministic:

R=0, G=19, B=0
R=130, G=16, B=255
R=130, G=16, B=255
R=130, G=16, B=255
R=130, G=16, B=255
R=130, G=16, B=255
R=130, G=16, B=255
R=130, G=16, B=255
R=130, G=16, B=255
R=130, G=16, B=255

If you looked at the compiled code with a disassembler, you could reconstruct what was going on, in detail. The first call to notrandom() used an area of the stack that was not used by this program previously; who knows what was in there. But after that call to notrandom(), there is a call to printf() (which the GCC compiler actually optimizes to a call to putchar(), but never mind) and that overwrites the stack. So the next and subsequent times, when notrandom() is called, the stack will contain stale data from the execution of putchar(), and since putchar() is always called with the same arguments, this stale data will always be the same, too.

So there is absolutely nothing random about this behavior, nor do the numbers obtained this way have any of the desirable properties of a well-written pseudorandom number generator. In fact, in most real-life scenarios, their values will be repetitive and highly correlated.

Indeed, as others, I would also seriously consider firing someone who tried to pass off this idea as a "high performance RNG".


“in a deterministic computer, nothing is random” — This isn’t actually true. Modern computers contain all kinds of sensors that allow you to produce true, unpredictable randomness without separate hardware generators. On a modern architecture, the values of /dev/random are often seeded from such hardware sources, and are in fact “quantum noise”, i.e. truly unpredictable in the best physical sense of the word.
But then, that's not a deterministic computer, is it? You are now relying on environmental input. In any case, this takes us well beyond the discussion of a conventional pseudo-RNG vs. "random" bits in uninitialized memory. Also... look at the description of /dev/random to appreciate just how far out of their way the implementers went to ensure that the random numbers are cryptographically secure... precisely because the input sources are not pure, uncorrelated quantum noise but rather, potentially highly correlated sensor readings with only a small degree of randomness. It's pretty slow, too.
6502

Undefined behavior means that the authors of compilers are free to ignore the problem because programmers will never have a right to complain whatever happens.

While in theory, when entering UB land, anything can happen (including demons flying out of your nose), what it normally means in practice is that compiler authors just won't care and, for local variables, the value will be whatever is in the stack memory at that point.

This also means that the content will often be "strange" but fixed, or slightly random, or variable but with a clearly evident pattern (e.g. increasing values at each iteration).

For sure you cannot expect it to be a decent random generator.


Martijn

Undefined behaviour is undefined. It doesn't mean that you get an undefined value; it means that the program can do anything and still meet the language specification.

A good optimizing compiler should take

void updateEffect(){
    for(int i=0;i<1000;i++){
        int r;
        int g;
        int b;
        star[i].setColor(r%255,g%255,b%255);
        bool isVisible;
        star[i].setVisible(isVisible);
    }
}

and compile it to a no-op. This is certainly faster than any alternative. It has the downside that it will not do anything, but such is the downside of undefined behaviour.


A lot depends on whether the purpose of a compiler is to help programmers produce executable files that meet domain requirements, or whether the purpose is to produce the most "efficient" executable whose behavior will be consistent with the minimal requirements of the C Standard, without regard for whether such behavior will serve any useful purpose. With regard to the former objective, having code use some arbitrary initial values for r, g, b, or triggering a debugger trap if practical, would be more useful than turning the code into a nop. With regard to the latter objective...
...an optimal compiler should determine what inputs would cause the above method to execute, and eliminate any code which would only be relevant when such inputs are received.
@supercat Or its purpose could be C. to produce efficient executables in compliance with the Standard while helping the programmer find places where compliance may not be useful. Compilers can meet this compromise purpose by emitting more diagnostics than the Standard requires, such as GCC's -Wall -Wextra.
That the values are undefined does not mean that the behavior of surrounding code is undefined. No compiler should noop that function. The two function calls, whatever inputs they're given, absolutely MUST be called; the first MUST be called with three numbers between 0 and 255, and the second MUST be called with either a true or false value. A "good optimizing compiler" could optimize the function params to arbitrary static values, getting rid of the variables completely, but that's as far as it could go (well, unless the functions themselves could be reduced to noops on certain inputs).
"That the values are undefined does not mean that the behavior of surrounding code is undefined" It absolutely does mean that in both C and C++. If you don't like that, using a different language.
Caleth

Not mentioned yet, but code paths that invoke undefined behavior are allowed to do whatever the compiler wants, e.g.

void updateEffect(){}

Which is certainly faster than your correct loop, and because of UB, is perfectly conformant.


Shafik Yaghmour

For security reasons, new memory assigned to a program has to be cleared; otherwise the information could be used and passwords could leak from one application into another. Only when you reuse memory do you get values different from 0. And it is very likely that, on a stack, the previous value is simply fixed, because the previous use of that memory is fixed.
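A quick, platform-dependent experiment along these lines (a sketch, not a guarantee: allocator and OS behavior vary, and reading indeterminate bytes is only defensible through unsigned char):

#include <cstdio>
#include <cstdlib>

int main() {
    // 16 MiB: large enough that the allocator typically requests fresh,
    // zeroed pages from the OS instead of recycling previously used memory.
    const std::size_t n = std::size_t{1} << 24;
    unsigned char* p = static_cast<unsigned char*>(std::malloc(n));
    if (!p) return 1;
    std::size_t nonzero = 0;
    for (std::size_t i = 0; i < n; i++)
        nonzero += (p[i] != 0);
    std::printf("non-zero bytes: %zu of %zu\n", nonzero, n);
    std::free(p);
    return 0;
}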


Jos

Your particular code example would probably not do what you are expecting. While technically each iteration of the loop re-creates the local variables for the r, g, and b values, in practice it's the exact same memory space on the stack. Hence it won't get re-randomized with each iteration, and you will end up assigning the same 3 values for each of the 1000 colors, regardless of how random the r, g, and b are individually and initially.

Indeed, if it did work, I would be very curious as to what's re-randomizing it. The only thing I can think of would be an interleaved interrupt that piggybacked atop that stack, which is highly unlikely. Perhaps internal optimization that kept those as register variables rather than as true memory locations, where the registers get re-used further down in the loop, would do the trick too, especially if the set-visibility function is particularly register-hungry. Still, far from random.


Arslan Ahmed

As most people here have mentioned, this is undefined behavior. Undefined also means that you may (with luck) get some valid integer value, and in that case this will be faster (as no call to rand() is made). But don't use it in practice. I am sure it will give terrible results, as luck is not with you all the time.


Very good point! It may be a pragmatic trick, but indeed one that requires luck.
There is absolutely no luck involved. If the compiler does not optimize the undefined behavior away, the values that you get will be perfectly deterministic (= depend entirely on your program, its inputs, its compiler, the libraries it uses, the timing of its threads if it has threads). The problem is that you can't reason about these values since they depend on implementation details.
In the absence of an operating system with an interrupt-handling stack separate from the application stack, luck may well be involved, since interrupts will frequently disturb the contents of memory slightly beyond the current stack contents.
Frankie_C

Really bad! Bad habit, bad result. Consider:

A_Function_that_use_a_lot_the_Stack();
updateEffect();

If A_Function_that_use_a_lot_the_Stack() always performs the same initialization, it leaves the stack with the same data on it. That data is what we get when calling updateEffect(): always the same value!


Barmar

I performed a very simple test, and it wasn't random at all.

#include <stdio.h>

int main() {

    int a;
    printf("%d\n", a);
    return 0;
}

Every time I ran the program, it printed the same number (32767 in my case) -- you can't get much less random than that. This is presumably whatever the startup code in the runtime library left on the stack. Since it uses the same startup code every time the program runs, and nothing else varies in the program between runs, the results are perfectly consistent.


Good point. A result strongly depends on where this "random" number generator is called in the code. It is unpredictable rather than random.
Zsolt Szatmari

You need a definition of what you mean by 'random'. A sensible definition is that the values you get should have little correlation. That's something you can measure. It's also not trivial to achieve in a controlled, reproducible manner. So undefined behaviour is certainly not what you are looking for.


supercat

There are certain situations in which uninitialized memory may be safely read using type "unsigned char*" [e.g. a buffer returned from malloc]. Code may read such memory without having to worry about the compiler throwing causality out the window, and there are times when it may be more efficient to have code be prepared for anything memory might contain than to ensure that uninitialized data won't be read (a commonplace example of this would be using memcpy on a partially-initialized buffer rather than discretely copying all of the elements that contain meaningful data).
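A sketch of that commonplace memcpy example (the Packet type is hypothetical):

#include <cstring>

// Only data[0..len) is ever written; the tail stays uninitialized.
struct Packet {
    unsigned len;
    unsigned char data[64];
};

void forward(Packet& dst, const Packet& src) {
    // Copying the whole struct in one memcpy can beat copying exactly
    // src.len bytes; the uninitialized tail is copied but never interpreted.
    std::memcpy(&dst, &src, sizeof src);
}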

Even in such cases, however, one should always assume that if any combination of bytes will be particularly vexatious, reading it will always yield that pattern of bytes (and if a certain pattern would be vexatious in production, but not in development, such a pattern won't appear until code is in production).

Reading uninitialized memory might be useful as part of a random-generation strategy in an embedded system where one can be sure the memory has never been written with substantially-non-random content since the last time the system was powered on, and if the manufacturing process used for the memory causes its power-on state to vary in semi-random fashion. Code should work even if all devices always yield the same data, but in cases where e.g. a group of nodes each need to select arbitrary unique IDs as quickly as possible, having a "not very random" generator which gives half the nodes the same initial ID might be better than not having any initial source of randomness at all.
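A hypothetical sketch of that embedded scenario (the noinit_ram symbol and its linker placement in a .noinit section are assumptions, not shown here):

// Survives reset uninitialized; its power-on contents vary semi-randomly
// from chip to chip, per the manufacturing caveat described above.
extern unsigned char noinit_ram[256];

unsigned seedFromPowerOnRam() {
    unsigned h = 2166136261u;            // FNV-1a offset basis
    for (unsigned char byte : noinit_ram)
        h = (h ^ byte) * 16777619u;      // FNV-1a prime
    return h;  // collisions across nodes are possible, but some seed beats none
}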


"if any combination of bytes will be particularly vexatious, reading it will always yield that pattern of bytes" -- until you code to cope with that pattern, at which point it is no longer vexatious and a different pattern will be read in future.
@SteveJessop: Precisely. My line about development vs production was intended to convey a similar notion. Code shouldn't care about what's in uninitialized memory beyond a vague notion of "Some randomness might be nice". If program behavior is affected by the contents of one piece of uninitialized memory, the contents of pieces that are acquired in future may in turn be affected by that.
Shafik Yaghmour

As others have said, it will be fast, but not random.

What most compilers will do for local variables is to grab some space for them on the stack, but not bother setting it to anything (the standard says they don't need to, so why slow down the code you're generating?).

In this case, the value you'll get will depend on what was previously on the stack - if you call a function before this one that has a hundred local char variables all set to 'Q', and then call your function after that one returns, then you'll probably find your "random" values behave as if you'd memset() them all to 'Q's.

Importantly for your example function trying to use this, these values won't change each time you read them; they'll be the same every time. So you'll get 100 stars all set to the same colour and visibility.

Also, nothing says that the compiler shouldn't initialize these values - so a future compiler might do so.

In general: bad idea, don't do it. (like a lot of "clever" code level optimizations really...)


You're making some strong predictions about what will happen although none of that is guaranteed due to UB. It also is not true in practice.
Community

As others have already mentioned, this is undefined behavior (UB), but it may "work".

Apart from the problems already mentioned by others, I see one other disadvantage: it will not work in any language other than C and C++. I know that this question is about C++, but if you can write code which is good C++ and good Java code at the same time, and it's not a problem, then why not? Maybe some day someone will have to port it to another language, and searching for bugs caused by "magic tricks" like this UB will definitely be a nightmare (especially for an inexperienced C/C++ developer).

Here is a question about another, similar UB. Just imagine trying to find a bug like this without knowing about this UB. If you want to read more about such strange things in C/C++, read the answers to the question from that link and see this GREAT slideshow. It will help you understand what's under the hood and how it works; it's not just another slideshow full of "magic". I'm quite sure that even the most experienced C/C++ programmers can learn a lot from it.


Community

It is not a good idea to base any logic on language undefined behaviour. In addition to whatever has been mentioned/discussed in this post, I would like to mention that with a modern C++ approach/style, such a program may not even compile.

This was mentioned in my previous post, which explains the advantage of the auto feature and contains a useful link on the topic:

https://stackoverflow.com/a/26170069/2724703

So, if we change the above code and replace the actual types with auto, the program would not even compile.

void updateEffect(){
    for(int i=0;i<1000;i++){
        auto r;
        auto g;
        auto b;
        star[i].setColor(r%255,g%255,b%255);
        auto isVisible;
        star[i].setVisible(isVisible);
    }
}

DDan

I like your way of thinking. Really outside the box. However, the tradeoff is really not worth it. A memory-for-runtime tradeoff is a thing; accepting undefined behavior for runtime is not.

It must give you a very unsettling feeling to know you are using such "randomness" in your business logic. I wouldn't do it.


Shafik Yaghmour

Use 7757 every place you are tempted to use uninitialized variables. I picked it randomly from a list of prime numbers:

- it is defined behavior
- it is guaranteed to not always be 0
- it is prime
- it is likely to be as statistically random as uninitialized variables
- it is likely to be faster than uninitialized variables, since its value is known at compile time


For comparison, see the results in this answer: stackoverflow.com/a/31836461/2963099
xyz

There is one more possibility to consider.

Modern compilers (ahem g++) are so intelligent that they go through your code to see what instructions affect state, and what don't, and if an instruction is guaranteed to NOT affect the state, g++ will simply remove that instruction.

So here's what will happen. g++ will definitely see that you are reading, performing arithmetic on, and saving what is essentially a garbage value, which produces more garbage. Since there is no guarantee that the new garbage is any more useful than the old one, it will simply do away with your loop. BLOOP!

This method is useful, but here's what I would do. Combine UB (Undefined Behaviour) with rand() speed.

Of course, reduce the number of rand() calls executed, but mix them in so the compiler doesn't do anything you don't want it to.
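A sketch of that idea, reusing the question's star array (an assumption from the question); it also assumes RAND_MAX >= 2^25 - 1, which is true for glibc but not guaranteed by the standard:

#include <cstdlib>

void updateEffect() {
    for (int i = 0; i < 1000; i++) {
        int bits = std::rand();  // one PRNG call per star instead of four
        star[i].setColor(bits & 255, (bits >> 8) & 255, (bits >> 16) & 255);
        star[i].setVisible(((bits >> 24) & 1) != 0);
    }
}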

And I won't fire you.


I find it very hard to believe a compiler can decide your code is doing something silly and remove it. I'd expect it only to optimise away unused code, not inadvisable code. Do you have a reproducible test case? Either way, the recommendation of UB is dangerous. Plus, GCC isn't the only competent compiler around, so it's unfair to single it out as "modern".
dbush

Using uninitialized data for randomness is not necessarily a bad thing if done properly. In fact, OpenSSL does exactly this to seed its PRNG.

Apparently this usage wasn't well documented, however, because someone noticed Valgrind complaining about the use of uninitialized data and "fixed" it, causing a bug in the PRNG.

So you can do it, but you need to know what you're doing and make sure that anyone reading your code understands this.


This is going to depend on your compiler, which is expected with undefined behavior; as we can see from my answer, clang today will not do what they want.
That OpenSSL used this method as an entropy input does not say that it was any good. After all, the only other entropy source they used was the PID. Not exactly a good random value. From someone who relies on such a bad entropy source, I won't expect good judgement on their other entropy sources. I just hope the people who currently maintain OpenSSL are brighter.