Come on bro, be real: no lock her up, no bump stocks, no wall, swamp still stinky. Should I go on? Doing better than I thought, but not quite delivering.
Waiting until everything aligns; then we will see the axe fall. When we get the vote in November, all we will see is a sea of red at all levels of government.
There is a ton of truth to this, and lies are spread by all sides. Propaganda is real. Like erasing history or distorting the truth, it gets embedded in the psyche of the peeps and then is carried forward... distorted. One example: why are we in Afghanistan? It's all about world resources... all governments have been playing the same game since the beginning of time. Don't be naive enough to think that right or left is something to hold on to. Sure, there are emotional and moral deviations, but our government, like so many, is pulling the wool and simple-minded cats don't know it's happening! Y'all have a good weekend, and I'm glad to see DP got upright at the bar! Auuuuwrite, peace!!!
By Hilke Schellmann, WSJ, Oct. 15, 2018 5:29 a.m. ET

Seeing isn't believing anymore. Deep-learning computer applications can now generate fake video and audio recordings that look strikingly real.

In a recent video published by researchers to show how the technology works, an actor sits in front of a camera moving his face. The computer then generates the same expressions in real time on an existing video of Barack Obama. When the actor shakes his head, the former president shakes his head as well. When he speaks, Mr. Obama speaks as well.

"This is a big deal," Hany Farid, computer science professor at Dartmouth College, told The Wall Street Journal. "You can literally put into a person's mouth anything you want."

Prof. Christian Theobalt, part of a team working on the technology at the Max-Planck-Institute for Informatics in Germany, said he is motivated by the creative possibilities that it holds for the future. He said researchers have developed forensic methods to detect fakes.

But Prof. Farid says researchers who push computer-generated technology need to think about the consequences these computer-generated fakes could have for society. He believes forensic experts are being outpaced by the development of fakes and that there is no method yet that can detect them all. "How are we going to believe anything anymore that we see? And so to me that's a real threat to our democracy," Mr. Farid said.

"I know I'm going to lose this game... At the end of the day, as the analyst trying to distinguish the real from the fake, I will lose." —Hany Farid, Dartmouth College

In the video above, WSJ's Jason Bellini explores this world of realistic video fakes. He gets deepfaked himself, and thanks to a deep-learning application, he can now dance like Bruno Mars. He also learns of the dark side of this technology, through one victim whose life has been deeply affected by deepfakes, and why others believe they could even lead to war.
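For anyone curious about the mechanics behind the reenactment the article describes (an actor's expressions driving an existing video of someone else), the first step roughly amounts to tracking the actor's facial motion frame by frame and handing that motion to a learned renderer that redraws the target face. The snippet below is only a minimal sketch of the tracking half, assuming the OpenCV and MediaPipe libraries; the file names and the render_target_frame stub are hypothetical placeholders, not the researchers' actual system.

```python
# Illustrative sketch: pull per-frame facial landmarks from a "driving" video
# (the actor) so they could condition a renderer that re-animates a target face.
import cv2                # pip install opencv-python
import mediapipe as mp    # pip install mediapipe

mp_face_mesh = mp.solutions.face_mesh

def driving_landmarks(video_path):
    """Yield normalized (x, y) facial landmarks for each frame of the driving video."""
    cap = cv2.VideoCapture(video_path)
    with mp_face_mesh.FaceMesh(static_image_mode=False, max_num_faces=1) as mesh:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_face_landmarks:
                face = result.multi_face_landmarks[0]
                yield [(lm.x, lm.y) for lm in face.landmark]
    cap.release()

def render_target_frame(target_frame, landmarks):
    """Placeholder for the learned renderer that would redraw the target face
    with the driving actor's expression; real systems train a neural model here."""
    return target_frame  # no-op stand-in

if __name__ == "__main__":
    target = cv2.imread("target_reference.jpg")          # hypothetical target still
    for lms in driving_landmarks("actor_driving.mp4"):   # hypothetical driving clip
        fake_frame = render_target_frame(target, lms)
```

The hard part, and the reason these fakes look so real, is the rendering step this sketch deliberately leaves as a stub: a network trained on footage of the target learns to produce photorealistic frames that match whatever motion it is given.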
"Your Honor, this photo of my client in bed with this person is inadmissible as evidence on the basis that with modern technology, one could have a photo of my client in bed with anyone, including Your Honor." The quote is from a chapter in the book Media Lab, published in the mid 1980s, and the attribution is "some lawyer, any day now." That day is now. This is really scary stuff.