There is an ongoing controversy over the profiling of people as they pass through security checkpoints in airports. In the United Kingdom specifically, black people are racially profiled: they are statistically more likely to be pulled aside for a detailed check than white people. To head off accusations of unfairness, I'm told that some security points will pull aside a black person and then, almost immediately afterwards, pull aside a randomly chosen white person, so that the percentages balance out.
This is obviously ridiculous, because it fails to account for the differing proportions of each colour of human passing through the checkpoint. If, say, one passenger in twenty is black, then searching one white passenger for every black passenger searched still leaves black passengers many times more likely to be searched than white ones.
How about this: the checkpoint is fitted with a random number generator. Each time a person passes through the checkpoint, the machine rolls a metaphorical die to determine whether that person should be subjected to a detailed check or not. Because the generator would be truly random, there would be no statistical way to preferentially search any particular race or other subclass of human. There would be no way to profile. Likewise, because the generator could be audited for randomness, a person who believed himself to have been searched unfairly would have no grounds for objection. The legal questions would become easier to answer.
The only question remaining would be what probability to set, and who should be allowed to set it. The probability could not be under the direct control of the security personnel, because then they could turn the dial up to 1.000 whenever they saw a suspicious-looking character coming. It would have to be modifiable, though, for times of high alert.
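The proposed mechanism is simple enough to sketch in a few lines. This is a minimal illustration, not a description of any real checkpoint system; the function name and the 5% figure are assumptions for the sake of the example. The key property is that the decision consults only the dial setting and the random number generator, never any attribute of the passenger.

```python
import random

def should_search(p: float, rng: random.Random) -> bool:
    """Decide whether this passenger receives a detailed check.

    The decision depends only on the centrally-set probability p and
    the random number generator -- no attribute of the passenger is
    consulted, so profiling is structurally impossible.
    """
    if not 0.0 <= p <= 1.0:
        raise ValueError("search probability must be between 0 and 1")
    return rng.random() < p

# Over many passengers the search rate converges on p, regardless of
# who walks through. Seeded here only so the demo is reproducible.
rng = random.Random(42)
p = 0.05  # illustrative dial setting: 5% of passengers under normal alert
searched = sum(should_search(p, rng) for _ in range(100_000))
print(searched / 100_000)  # close to 0.05
```

Note that the dial value `p` is passed in rather than read from a setting the gate officer can touch, reflecting the constraint that security personnel must not control it directly; raising it for times of high alert would be a matter of changing the central setting.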