100% me when I first started github: “welp it's saying something I don't understand, time to nuke the local copy and restart”
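For the uninitiated, the joked-about workflow looks something like this (a minimal sketch; a local bare repo stands in for GitHub so it runs anywhere, paths are made up):

```shell
# Simulate the beginner "fix": when git says something scary,
# delete the local copy and clone again from scratch.
git init --bare -q /tmp/demo-remote.git        # stand-in for the GitHub remote
git clone -q /tmp/demo-remote.git /tmp/demo-local   # initial clone
# ... git prints something we don't understand ...
rm -rf /tmp/demo-local                         # nuke the local copy (and any uncommitted work!)
git clone -q /tmp/demo-remote.git /tmp/demo-local   # restart fresh
```

Works every time, loses your uncommitted changes every time.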
Oh hi Chancellor of Germany lol
Israel definitely has the right to exist. But so do a hundred-odd other countries, yet you do not acknowledge each of them one by one. All Germany has to do going forward is provide the necessary education on the dangers of antisemitism and fascism, and take that task very seriously. This is just meaningless overcompensation, or even worse, maybe the result of meaningless lobbying.
Can create faces that have never existed, sure, but can you guarantee that the child in the CP it creates does not look identical to a child that already exists? After all, it can very well produce something using children directly from, or very similar to, its training set.
yea well, there is no way to guarantee that AI won't spew out CP where the child looks exactly like a child it has seen in its training set, i.e. a child that really exists. So, no go.
better yet, upload gigabytes of senseless text and photos and let Microsoft train their AI on that
proceeds to justify the cost of unpaid peer-reviewed digital publishing using pie charts and bar plots
I guess the end result would be the same. But at large, the economic system and human nature would be to blame, which is actually what I am trying to blame here too: not AI, but the people in power who abuse AI and steer it towards hype and profit.
nope, doesn't help, imma get my flamethrower nevertheless
“By involuntarily uploading your data to onedrive you also agree for it to be used in training AI models”
They even go to the bathroom and bedroom with their shoes. Un-fucking-believable
For instance, I would be completely fine with this if they said “We will train it on a very large database of articles, and finding relevant scientific information will be easier than before”. But no, they have to hype it up with nonsense expectations so they can generate short-term profits for their fucking shareholders. This will either come at the cost of the next AI winter or of a senseless allocation of major resources to a model of AI that is not sustainable in the long run.
I am not denying the positive use cases being employed now, or those that may be employed in the future. I am not opposing the use or development of AI tools now and in the future either.
However, the huge negative possibilities are very real too and are/will be affecting humanity. I am against the course big AI companies seem to be taking, and against the possible future allocation of most major tech innovations to their cause.
It is of course very hard to predict how the positives and negatives will balance out, but I think big tech companies don't have any interest in balancing them. They seem to be very short-sighted about anything other than direct profits. I think they will take the easiest way to more profit/AI dominance, which is short-term investment. So I am not very optimistic about how it will pan out. Maybe I am wrong, and like computers it will open up a whole new world of possibilities. But the landscape then and the landscape now are quite different in terms of how big tech companies and the richest people act.
I feel like all the useful stuff you have listed here is more like advanced machine learning, and different from the AI that is being advertised to the public and mostly invested in. These are mostly things we can already do relatively easily with the available AI (i.e. highly sophisticated regression) for relatively low compute power and low energy requirements (not counting more outlier stuff like AlphaFold, which still requires quite a lot of compute power). It is not related to why the AI companies will need to own most of the major computational and energy innovations in the future.
It is the image/text generative part of AI that looks more sexy to the public, and that is therefore mostly being hyped/advertised/invested in by big AI companies. It is also this generative part of AI that will require much bigger computation and energy innovations to keep delivering significantly more than it can now. The situation is very akin to the moon race. It feels like trying to develop AI on this “brute force” course will deliver relatively low benefits for the cost it will incur.
I take it that this was social sciences, because based on what I have seen so far, I don't think it can even outperform a college kid at maths.