Technology Stocks: The Singularity, A.I.: Machine & Deep Learning, and GFA

To: koan who wrote (77) 9/27/2017 8:48:12 PM
From: dvdw©
   of 229
That POV is right up your alley... cognitive dissonance and all that goes with it boils down to the biology of inadequate belief. Remember peak oil, that wasteland of obfuscation?

Momentum comes and goes as fashions dictate. (Politics in the cesspool, as they are.) You must have new instructions; new terrain is always bought and paid for.

Counting lines of output and input is governed under the terms of counter-programming Popper. RO/RS = CF. This equation replaces E=mc², but you will never get it.


To: dvdw© who wrote (78) 9/27/2017 9:03:48 PM
From: koan
   of 229
You are right, I have no idea what that post says. Neither does anyone else, I think :).

Especially that first paragraph.

AI, the singularity, will be about complex thought, higher order thinking.

What that will turn out to be, who knows; only that we had better learn to think logically and in a complex manner.

Ordinary thinking will not cut it any more.

Brave new world.




To: John P who wrote (74) 9/27/2017 9:33:10 PM
From: Glenn Petersen
   of 229
The formation of the Vision Fund by Masayoshi Son is either a sign of a market top or an act of genius. Mr. Son has been down this road before.

As the bubble burst, he reportedly lost $70 billion in one day. He admits that 99% of his net worth was wiped out in 2000.

Message 31282818


To: koan who wrote (59) 9/29/2017 7:59:15 PM
From: Doren
   of 229
Thanks for the Kudos...

- I've put Homo Deus on my booklist, which is really for tomorrow, when I tire of the internet and start reading again, when movies and music are no longer cheap. I'm currently collecting books, mostly by classic great writers and philosophers, but only on the cheap.

I'm currently watching a 24-hour lecture series on Great Britain from the Tudors through the Stuarts. There is plenty of free video on the web now; you can watch lecture courses that would cost thousands at Harvard et al. for free. European history has been an obsession of mine for some time, since our world view formed primarily there, after the Reformation and the Enlightenment.

- Hippies

I didn't blame them necessarily, especially not the real hippies (as opposed to posers). I was one of the real ones, growing up two hours from Haight-Ashbury in Sacramento during the '60s. I just think it's ironic that it turned out like it did.

What I meant was we were all obsessed with the future, Star Trek... the naive idea that our ideas would change the world into a near Star Trekian utopia. We failed to a large degree to change anything...

What it was was a modern enlightenment... it changed the world view of many... but not enough.

We didn't see AI and genetic engineering coming so soon and merging. So it's ironic that many of the San Francisco people/hippies who created the personal computer industry, because they were utopian futurists, created a business that enabled and, I think, will inevitably lead to the end of the natural human race. People like Jobs, Wozniak, the Knoll brothers, and Robert Abel, who was a friend of mine:

Robert Abel WIKI

It would have been hard for anyone to see that so I don't blame them. Computers have brought us many good things. I write music on my computer, very good music I think.

You can listen free here, I recommend the Unsound Educational CD

But nonetheless I think we are toast... whether that is the natural evolution scenario after all is anyone's guess at this time.

For myself, I'm 64, just about to semi-retire and hopefully live in the woods doing what I want, as unaffected by coming technology as is possible. I don't think I'll manage to live until immortality comes... which is right after the bionic/AI merge.


From: The Ox 10/2/2017 2:37:08 PM
   of 229

What do you think about the current debate about artificial intelligence? Elon Musk has said it poses an existential threat to humanity.

Technology has always been a double-edged sword, since fire kept us warm but burned down our houses. It’s very clear that overall human life has gotten better, although technology amplifies both our creative and destructive impulses. A lot of people think things are getting worse, partly because that’s actually an evolutionary adaptation: It’s very important for your survival to be sensitive to bad news. A little rustling in the leaves may be a predator, and you better pay attention to that. All of these technologies are a risk. And the powerful ones—biotechnology, nanotechnology, and A.I.—are potentially existential risks. I think if you look at history, though, we’re being helped more than we’re being hurt.

How will artificial intelligence and other technologies impact jobs?

We have already eliminated all jobs several times in human history. How many jobs circa 1900 exist today? If I were a prescient futurist in 1900, I would say, “Okay, 38% of you work on farms; 25% of you work in factories. That’s two-thirds of the population. I predict that by the year 2015, that will be 2% on farms and 9% in factories.” And everybody would go, “Oh, my God, we’re going to be out of work.” I would say, “Well, don’t worry, for every job we eliminate, we’re going to create more jobs at the top of the skill ladder.” And people would say, “What new jobs?” And I’d say, “Well, I don’t know. We haven’t invented them yet.”

That continues to be the case, and it creates a difficult political issue because you can look at people driving cars and trucks, and you can be pretty confident those jobs will go away. And you can’t describe the new jobs, because they’re in industries and concepts that don’t exist yet.


To: The Ox who wrote (82) 10/2/2017 2:58:45 PM
From: The Ox
   of 229

Predicting couple therapy outcomes based on speech acoustic features
Introduction

Behavioral Signal Processing (BSP) [1, 2] refers to computational methods that support the measurement, analysis, and modeling of human behavior and interactions. The main goal is to support the decision making of domain experts, such as mental health researchers and clinicians. BSP maps real-world signals to behavioral constructs, often abstract and complex, and has been applied in a variety of clinical domains including couples therapy [1, 3, 4], Autism Spectrum Disorder [5], and addiction counseling [6, 7]. Parallel work focused on social context rather than health domains can be found in [8, 9].

Notably, couple therapy has been among the key application domains of Behavioral Signal Processing. There have been significant efforts in characterizing the behavior of individuals engaged in conversation with their spouses during problem-solving interaction sessions. Researchers have explored information gathered from various modalities such as vocal patterns of speech [3, 4, 10, 11], spoken language use [1, 12], and visual body gestures [13]. These studies are promising steps towards the creation of automated support systems for psychotherapists, providing objective measures for diagnostics, intervention assessment, and planning. This entails not only characterizing and understanding a range of clinically meaningful behavioral traits and patterns but, critically, also measuring behavior change in response to treatment. A systematic and objective study and monitoring of the outcome relevant to the respective condition can facilitate positive and personalized interventions. In particular, in clinical psychology, predicting the outcome of the relationship of a couple undergoing counseling (or measuring it from couple interactions, without couple- or therapist-provided metrics) has been a subject of long-standing interest [14–16].

Many previous studies have manually investigated what behavioral traits and patterns of a couple can tell us about their relationship outcome, for example, whether a couple could successfully recover from their marital conflict or not. Often the monitoring of outcomes involves a prolonged period of time post treatment (up to 5 years), and highly subjective self-reporting and manual observational coding [17]. Such an approach suffers from the inherent limitations of qualitative observational assessment, the subjective biases of the experts, and great variability in the self-reporting of behavior by the couples. Having a computational framework for outcome prediction can be beneficial towards assessment of the employed therapy strategies and the quality of treatment, and can also help provide feedback to the experts.

In this article, we analyze the vocal speech patterns of couples engaged in problem-solving interactions to infer the eventual outcome of their relationship—whether it improves or not—over the course of therapy. The proposed data-driven approach focuses primarily on the acoustics of the interaction, which are unobtrusively obtainable and known to offer rich behavioral information. We adopt well-established speech signal processing techniques, in conjunction with novel data representations inspired by psychological theories, to design the computational scheme for therapy outcome prediction. We formulate outcome prediction as binary (improvement vs. no improvement) and multiclass (different levels of improvement) classification problems and use machine learning techniques to automatically discern the underlying patterns of these classes from the speech signal.
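To make the formulation concrete, here is a minimal sketch of that kind of pipeline: session-level functionals are computed from a frame-level pitch track and fed to a simple classifier. Everything below is illustrative — the data is synthetic, and the features and nearest-centroid classifier are assumptions for demonstration, not the paper's actual method.

```python
# Sketch: binary outcome prediction from simple acoustic functionals.
# Synthetic data; nearest-centroid classifier chosen only for brevity.
from statistics import mean, stdev

def functionals(f0_track):
    """Session-level summary of a frame-level pitch (f0) track."""
    return (mean(f0_track), stdev(f0_track), max(f0_track) - min(f0_track))

def train_centroids(features, labels):
    """Average the feature vectors of each outcome class."""
    by_class = {}
    for feats, y in zip(features, labels):
        by_class.setdefault(y, []).append(feats)
    return {y: tuple(mean(dim) for dim in zip(*vecs))
            for y, vecs in by_class.items()}

def predict(centroids, feats):
    """Assign the class whose centroid is nearest (squared Euclidean)."""
    def dist(y):
        return sum((a - b) ** 2 for a, b in zip(centroids[y], feats))
    return min(centroids, key=dist)

# Synthetic pitch tracks (Hz): "improved" couples shown with calmer,
# less variable f0 purely for illustration.
improved = [[180, 185, 190, 182, 188], [175, 180, 178, 182, 176]]
not_improved = [[150, 240, 160, 250, 170], [140, 260, 150, 245, 155]]
X = [functionals(t) for t in improved + not_improved]
y = [1, 1, 0, 0]

centroids = train_centroids(X, y)
print(predict(centroids, functionals([178, 184, 181, 186, 180])))  # prints 1
```

The multiclass case described in the article would follow the same shape, with labels for different levels of improvement instead of a binary flag.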

We compare prediction using features directly derived from speech with prediction using clinically relevant behavioral ratings (e.g., relationship satisfaction, blame patterns, negativity) manually coded by experts after observing the interactions. It should be noted that the human behavioral codes are based on watching videos of the interactions, which provide access to additional information beyond vocal patterns (the sole input of the proposed prediction scheme), including language use and visual nonverbal cues.

In addition to evaluating how well directly signal-derived acoustic features compare with manually derived behavioral codes as features for prediction, we also evaluate the prediction of the outcome when both feature streams are used together.

We also investigate the benefit of explicitly accounting for the dynamics and mutual influence of the dyadic behavior during the interaction towards the prediction task. The experimental results show that dynamic functionals that measure relative vocal changes within and across interlocutors contribute to improved outcome prediction.
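A rough sketch of what such "dynamic functionals" could look like: within-speaker turn-to-turn changes and across-speaker turn-by-turn differences in a vocal measure, summarized per session. The turn-level pitch means and the specific functionals are assumptions for illustration, not the paper's exact feature set.

```python
# Sketch: dynamic functionals capturing relative vocal changes within
# and across interlocutors over successive speaking turns.
from statistics import mean

def within_speaker_deltas(turn_values):
    """Change in a vocal measure from each turn to the next (same speaker)."""
    return [b - a for a, b in zip(turn_values, turn_values[1:])]

def across_speaker_diffs(speaker_a, speaker_b):
    """Turn-by-turn difference between the two interlocutors."""
    return [a - b for a, b in zip(speaker_a, speaker_b)]

def dynamic_functionals(speaker_a, speaker_b):
    """Summarize dynamics as mean absolute within- and across-speaker change."""
    within = within_speaker_deltas(speaker_a) + within_speaker_deltas(speaker_b)
    across = across_speaker_diffs(speaker_a, speaker_b)
    return {
        "mean_abs_within_change": mean(abs(d) for d in within),
        "mean_abs_across_diff": mean(abs(d) for d in across),
    }

# Mean pitch (Hz) per speaking turn for each partner in one interaction.
partner_1 = [180, 186, 179, 190]
partner_2 = [210, 205, 214, 208]
print(dynamic_functionals(partner_1, partner_2))
```

Values like these would be appended to the static per-session functionals before classification.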

The outline of the paper is as follows: we first discuss relevant literature, then describe the Couple Therapy Corpus used in the study (illustrated in Fig 1), give an overview of the methodologies for speech acoustic feature extraction, and describe the use of behavioral codes as features. We then provide an analysis of the proposed acoustic features and the results of the classification experiments, and finally conclude with a discussion of our findings as well as possible directions for future research.


To: The Ox who wrote (82) 10/3/2017 8:00:51 AM
From: w0z
   of 229
“Well, don’t worry, for every job we eliminate, we’re going to create more jobs at the top of the skill ladder.”

What happens if a large part of today's workforce is at an ~8th grade level of education?


To: The Ox who wrote (83) 10/3/2017 11:24:11 AM
From: The Ox
   of 229
Walmart spending on AI:

Message 31288137


To: The Ox who wrote (82) 10/3/2017 4:02:25 PM
From: Doren
   of 229
I own a Kurzweil synth. Ray is like a lot of Techies... brilliant... no common sense... totally isolated from blue collar people, he probably lives in a protected white collar island, probably inside a fenced and gated community with its own police force.

I work on those homes. They pay me pretty well because I do custom work. Super custom. I make every absurd demand they make come true.

"we’re going to create more jobs at the top of the skill ladder"

So... we are going to fill those jobs with people with IQs of 100? Good luck. #1, people who get those jobs are increasingly the best of the best; any recruiter knows that. #2, those jobs are frequently obsoleted... not many people are using COBOL anymore. I was a recruiter. As a recruiter I interviewed a lot of 40-year-old techies nobody wanted anymore, both because they only knew obsoleted software/languages and because nobody wanted to hire any techies over 40. Many of them were completely depressed and dejected. Being management, Kurzweil really has little in common with these kinds of people.

COBOL programmers were obsoleted... to the point TOO MANY LEFT THE FIELD... once a person is obsoleted they think twice about putting time into learning a new skill. A few had good stable jobs maintaining legacy systems... but no one thought about learning COBOL for decades.

Guys like Ray always remind me of this:

I would have changed it... but I would have charged at LEAST $100, probably more.


To: Doren who wrote (86) 10/3/2017 4:06:07 PM
From: The Ox
   of 229
Won't AI help teach people what they need to know in a fashion that will help them execute better? Shouldn't AI be able to find ways to communicate with different levels of intelligence/ability, which would help the "lesser-IQ" individual show more promise?


Copyright © 1995-2018 Knight Sac Media. All rights reserved. Stock quotes are delayed at least 15 minutes - See Terms of Use.