Thoughts on various subjects
Is the speed of gravitons higher than the speed of photons? I believe so.
Let's take black holes. A black hole has a so-called photonic horizon, beyond which photons can't go.
Since black holes have a photonic horizon, we can also say they have a gravitational horizon, beyond which gravitons can't go. If the gravitational horizon were smaller than the photonic horizon, there could be no photonic horizon, because the photons would escape the gravitational attraction. Therefore, for a black hole to exist, its gravitational horizon must be bigger than its photonic horizon.
Two parameters influence the size of the horizon: mass and speed (of gravitons or photons). Assuming that the speed of gravitons is equal to the speed of photons, we see that if the mass of gravitons were smaller than the mass of photons, gravitons would travel farther away from the black hole than photons would.
Certain experiments with gravitons suggest that the movement mass of gravitons is higher than the movement mass of visible photons. If so, the gravitational horizon would be smaller than the photonic horizon, and black holes could not exist.
But if the speed of gravitons were higher than the speed of photons, the gravitational horizon would be bigger than the photonic horizon.
This suggests that there could exist gravitational holes: objects so heavy that even gravitons can't go beyond their gravitational horizon.
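Loosely, this picture can be put in Newtonian, ballistic terms (a heuristic only; general relativity does not treat gravity as particles climbing out of a potential well): a particle launched radially at speed v from radius r_0 around mass M reaches a maximum distance before falling back, and that maximum distance grows with v.

```latex
% Heuristic only: a particle thrown radially outward at speed v from
% radius r_0 around mass M (with v below escape speed) climbs until its
% kinetic energy is exhausted:
\frac{1}{2}v^2 - \frac{GM}{r_0} = -\frac{GM}{r_{\max}}
\quad\Longrightarrow\quad
r_{\max} = \frac{GM\,r_0}{GM - \frac{1}{2}v^2 r_0}
% r_max grows with v: in this picture, faster gravitons would reach
% farther out than photons, giving a larger gravitational "horizon".
```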
If they existed, they would be here.
Why assume they are still alive to be here? Why assume their presence could be detected by humans? In what form would they be here?
Intelligent beings with technologies advanced millions of years beyond our own, who have spread to the far ends of the galaxy, should not be difficult to detect.
If these civilizations are so advanced that they spread throughout the Universe, why should we assume they use the technology we use (like electromagnetic energy)? Surely they have already harnessed the power of tachyons. So, if they don't use our technology, why assume we can detect them?
We already possess the means to detect even primitive technological civilizations like our own at a distance of hundreds of light years.
Can we really? Our own electromagnetic emissions are weak because we don't try to communicate with extraterrestrial civilizations. As for the emissions we do send in an attempt to communicate with extraterrestrial civilizations, we started them only a few decades ago and on a limited scale. The interception of such signals is just as limited.
The development level of a civilization has two barriers: deep-space travel at sub-light speed, and deep-space travel at super-light speed. Let's say that civilizations which have passed the first barrier belong to class 2, and those which have passed the second belong to class 3; class 1 is for civilizations which are not yet capable of deep-space travel.
In the first case, even if a civilization does achieve sub-light speed travel, it would be confronted with the following problems: the ship would have to send information back to its origin, the origin would have to intercept electromagnetic waves of extremely low energy, and the ship would have to withstand many years of interstellar travel. Therefore, the probability of detecting extraterrestrial life with this method is extremely low, virtually zero.
The second case means that the civilization which travels in deep space has reached a certain technological level (one method of travel is explained in The Universe section). However, this level can't be achieved through technological advances alone; it also requires mental development, or else the civilization would self-destruct with its own (destructive) technology.
If intelligent life is common then why, over the billions of years that preceded our appearance, has no species already filled the galaxy?
The meaning of life may be to reproduce and expand, but what is the meaning of life capable of deep-space travel with super-light speed?
Having reached a higher mental level than the human civilization has, a civilization would see no more reason to expand, at least not in areas where other civilizations exist. Thinking that mentally evolved civilizations would destroy other civilizations in order to expand their presence is a mere nightmare of the human mind.
A class 3 civilization has no reason to be destructive, because it has already overcome the destructive phase of its existence; otherwise it would have self-destructed (with its own technology).
Of course, some people might believe that there could be civilizations which somehow reached deep-space travel at super-light speed before they could self-destruct. Such people have to be reminded that there is not only one class 3 civilization in the Universe, but many, and if one tried to destroy others, it would be stopped by the more "sane" class 3 civilizations.
Another reason for not expanding is that such civilizations have reached a level where resources are very well balanced. These civilizations are capable of creating matter from energy; the technologies would be similar to those used for super-light speed travel. The energy may be obtained from sources essentially different from anything the human civilization knows; there is no reason for humans to believe that energy can't also be created from mere space, and for that there are string theory and the Big Bang theory. But to simplify, creating matter from energy is available even to the human civilization, though only on a small scale and at prohibitive costs.
Class 3 civilizations understand it is imperative not to interfere with civilizations that have not reached the same class of technological and mental development. Such interferences would be catastrophic for the less evolved civilization.
One quick problem would be that if the class 3 civilization gave its technology to the less evolved civilization, the latter would cease to evolve, since it would already have "the future" in the present. The fire of the "new" civilization would die. No scientist would be able to invent anymore, because his work would have already been done by the more evolved civilization. This would mean death for the youngsters, and loneliness for the elders.
No civilization is capable of taking the balance of Nature into its own hands. Nature is to be left to its own course. This is what mentally evolved civilizations (and even humans) have learned in their many years of evolution.
Class 3 civilizations have no need to expand at the expense of others, and therefore they have no need to "be here". And even if they are "here" to study us, they certainly don't need to show themselves to us.
The human civilization is at the very threshold of class 3. However, it hasn't yet achieved the mental evolution necessary to move into class 3. If it were to obtain, now, the technology to travel in deep space at super-light speed, it would only succeed in spreading its terror and nightmares throughout the Universe.
So, you don't think toilet paper can be improved, huh? It can be!
Most toilet papers break easily, or are hard to break but feel like sandpaper, or get stuck inside...
I wonder why nobody produces oiled toilet paper?! The oil would ensure that the paper needs only a few layers, is resistant yet soft, and slides in and out without problems, thus providing a pleasant experience (instead of an otherwise unpleasant one).
You could try it for yourself to see how great it is. Simply spread a tiny amount of some body cream on a piece of toilet paper and use that to wipe yourself. Spread the cream on the middle of the paper; there is really no need to spread it all over the paper, and that would actually be bad because you wouldn't be able to hold the paper properly.
Even better, the oil could have therapeutic properties. I guess there could be two types of oil, or just one (if it doesn't cost much more).
I wonder if the motto of such a producer could be "We're in and out without anybody noticing!" :)
A suggestion I made to various search engines. (Not the only one, but an important idea which has high chances to be implemented.)
As you know, today's Internet searches return overwhelming amounts of garbage.
To solve this problem, allow users to create personal accounts on your search engine. This would let each user create his own list of links in which your search engine would search for the given keywords.
Users can collect links from everywhere (including places dedicated to gathering manually checked sites) and add them to their personal profile on the search engine, then later search through these links (instead of the entire Internet). This can add up to tens or even hundreds of thousands of links (too much data to transfer from the user's computer for every search, so you can't put this list in cookies, and thus you need personal accounts).
You could also use these lists of links to see which websites users consider good enough to search (and improve their ranks accordingly). Monitor whether these links are actually used for searching, rather than just sitting there; the more they are used, the higher the ranking you give them. Basically, people do the sorting for you, for free.
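The mechanism can be sketched in a few lines. This is only an illustration with made-up names; a real engine would restrict the query at index level rather than loop over URLs in memory.

```python
# Sketch of a per-user link list ("personal profile") that restricts search
# to the user's own links and feeds usage counts back into ranking.
# All names here are illustrative, not any real engine's API.

class PersonalSearch:
    def __init__(self, index):
        # index: {url: set of keywords found on that page}; stands in for
        # the engine's full-web index.
        self.index = index
        self.profiles = {}   # user -> set of allowed urls (the personal list)
        self.usage = {}      # url -> how often searches actually hit it

    def add_links(self, user, urls):
        """Add links collected by the user to his personal profile."""
        self.profiles.setdefault(user, set()).update(urls)

    def search(self, user, keyword):
        """Search only within the user's own links, not the whole index."""
        hits = [url for url in self.profiles.get(user, ())
                if keyword in self.index.get(url, ())]
        for url in hits:
            # Links that are actually used for search earn rank; links
            # that just sit in profiles earn nothing.
            self.usage[url] = self.usage.get(url, 0) + 1
        return sorted(hits, key=lambda u: -self.usage[u])
```

For example, a user whose profile holds only `a.com` and `c.com` will never see `b.com` in results, even if it matches the keyword; the whole-Internet garbage is filtered out by construction.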
Of course, some people might want to profit from this by creating accounts with garbage links which are only intended to improve the ranks of various websites.
One way to combat this would be to charge users to open a personal account, say 5 USD per account; but charge only for opening the account, not for searches.
A suggestion I made to various developers of applications which can manage a partition in an encrypted file.
Hidden containers, as they are implemented now, are a useless feature from my point of view, because data can't be written to the outer container without the worry of destroying the hidden container. For this feature to be useful, I personally need to be able to store all my work in an encrypted container (and so make changes to it all the time), and also, perhaps, have a hidden container for truly private data.
Previously, I told you about splitting a container in two parts, one of which would be the "hidden" one, but you rejected that idea. (I don't even know if Windows is able to handle two partitions in the same file.)
Since I don't want to make significant changes to your program and use my own version, I have a suggestion which you may want to implement as an option when the program's API is released.
I would like to be able to create my own GUI for the program's driver. From there, I want to create the hidden container as another encrypted file inside the outer encrypted file. Basically, when the user creates a new encrypted container A, my application would tell your program to also create another encrypted file B inside A, of a specified size (for example, 30% of the size of A).
So, the "hidden" container B would actually be a simple file inside container A. Since B would always be created, users have plausible deniability, because they have no choice about whether B is created.
Of course, a password for container B would be asked for when container A is created. If none is provided, a random password would be used, and the file for hidden data would basically be a dummy file; the random password is necessary for plausible deniability. If the user later wants to use container B, he would have to go through a process that resets the password for container B, because he can't simply change a password he doesn't know (since it was random).
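The creation step might look roughly like this. This is only a sketch under simplifying assumptions: random bytes stand in for ciphertext, and B is modeled as a fixed region (the last 30% of A) rather than a real encrypted file; an actual implementation would live in the encryption driver.

```python
import os
import secrets

# Illustrative sketch only, not any real program's API. Container A is a
# file of random bytes standing in for encrypted data; "hidden container B"
# is defined as the last 30% of A.

def create_container(path, size, hidden_password=None):
    """Create container A of `size` bytes; B always occupies its last 30%."""
    inner_size = int(size * 0.30)
    if hidden_password is None:
        # No password given: generate a random one and throw it away, so B
        # becomes an unreadable dummy, indistinguishable from a B that is
        # actually in use. This is what plausible deniability requires.
        hidden_password = secrets.token_hex(32)
    # Random bytes stand in for ciphertext; real code would encrypt A with
    # the user's password and the B region with hidden_password.
    with open(path, "wb") as f:
        f.write(secrets.token_bytes(size))
    return inner_size  # B's region: the last inner_size bytes of A
```

The key point the sketch shows is that A looks exactly the same whether B holds data or not, because both cases produce indistinguishable (pseudo-)random bytes.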
Dummy files (or any other data containers) are a form of steganography: they might contain precious data, or they might not. Nobody can tell, even though everyone can see them.
There is one issue: the file for container B could be deleted by the user. For this, there is a solution which has to be implemented in your encryption driver: every time container A is mounted, check whether the file for container B exists (and has the right size); if it doesn't, recreate it (if there is enough space).
If there is not enough space, there is no problem, since at that point it is not essential to recreate B. Actually, this is an advantage, since users can simply delete the file for container B in order to use the entire space of container A. Plausible deniability is still maintained, because if there were enough space, the file for container B would be recreated anyway.
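The mount-time check could be sketched as follows. Again an illustration only: here B is modeled as a plain file next to A, and free space is passed in as a number, whereas a real driver would manage a region inside the container and query the filesystem itself.

```python
import os
import secrets

# Sketch of the mount-time check: if the dummy file B is missing or has the
# wrong size, recreate it, but only when enough free space remains.
# File names and the free_space parameter are hypothetical simplifications.

def ensure_hidden_file(b_path, expected_size, free_space):
    if os.path.exists(b_path) and os.path.getsize(b_path) == expected_size:
        return "present"
    if free_space < expected_size:
        # Not essential to recreate now; the user may have deleted B to
        # reclaim space. Deniability survives because B comes back as soon
        # as space allows.
        return "skipped"
    # Recreate B filled with random bytes (standing in for ciphertext).
    with open(b_path, "wb") as f:
        f.write(secrets.token_bytes(expected_size))
    return "recreated"
```

Run at every mount of A, this makes "B exists" the normal state of any container, so the presence of B says nothing about whether hidden data is in use.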
As a safety measure, never allow the deletion of the file for container B from a file manager, but only from your program.
The advantages of this method are:
The disadvantages of this method are:
And, who knows, maybe you'll think this method is good enough for you to implement it directly in your program.