
But when a state approaches a technology with intent, it is usually for military offensive purposes. I don't think that is a good idea in the context of AI! Although I also don't think there is any stopping it. The US has things like DARPA, for example, and a lot of Chinese investment seems to be made with the intent of providing capabilities to their army.

The list of things states have attempted to deploy offensively is nearly endless. Modern operations research arguably came out of the British empire attempting (and succeeding) to weaponise mathematics. If you give a state fertiliser it makes bombs, if you give it nuclear power it makes bombs, if you give it drones it makes bombs, if you give it advanced science or engineering of any form it makes bombs. States are the most ingenious system for turning things into bombs that we've ever invented; in the grand old days of siege warfare they even managed to weaponise corpses, refuse, and junk, because it turned out lobbing that stuff at the enemy was effective. The entire spectrum of technology, from nothing to nanotech, hurled at enemies to kill them.

We'd all love it if states committed to not doing evil, but the state is the entity most active at figuring out how to use new tech X for evil.



This is an extremely reductive and bleak way of looking at states. While the military is of course a major focus of states, it is very far from being the only one. States both historically and today invest massive amounts of resources in culture, civil engineering (roads, bridges, sanitation, electrical grids, etc.), medicine, and many other endeavors. Even the software industry still makes huge amounts of money from the state; a sizable portion of it is propped up by non-military government contracts (like Microsoft selling Windows, Office, and SharePoint to virtually all of the world's administrations).


Quick devil's advocate on a tangential point: is designing better killing tools necessarily evil? It seems like the nature of the world is eat or be eaten, and on the empire scale, conquer or be conquered. That latter point seems to be the historical norm. Even with democracy, reasoning doesn't prevail; force of numbers seems to be the final determiner. The point is, humans aren't easy to reason or negotiate with. Coercion has been the dominant force throughout history, especially when dealing with groups of different values.

If one group gives up the arms race for the ultimate coercion tools, or loses a conflict, then they become subservient to the winner's terms and norms (Japan, Germany, even Britain and France, plus all the smaller states in between, are subservient to the US).


> is design[ing] better killing tools necessarily evil?

Who could possibly have predicted that the autonomous, invincible doomsday weapon we created for the good of humanity might one day be used against us?


Yes, from an idealist or eventualist perspective, it's evil. But from another perspective: if you don't stay competitively capable of deadly force, you eventually become some other country's bitch. I'm not sure how much luxury nations and humans have to be pacifists. As we are seeing time and time again, now with Europe, being pacifists means the non-pacifists call the shots, and to one degree or another the pacifists become subservient to the will of the non-pacifists. It's from that perspective I'm arguing that making autonomous deadly weapons that might ultimately be the demise of humanity seems reasonable and not evil.


Frankly, I'd rather "become some other country's bitch, eventually" than immediately go out and risk annihilating all mankind. I don't think that's the actual choice, but even if it were, I think the moral choice is to not play the game. Or at least give the other side a chance to not participate in the arms race. China didn't start this, Russia didn't start this; we did. They are the ones trying to catch up. We don't know whether they'd keep running if we were to try to stop.


> is design[ing] better killing tools necessarily evil?

Great question! To add my two cents: I think many people here are missing an uncomfortable truth, which is that given enough motivation to kill other humans, people will repurpose any tool into a killing tool.

Just have a look at the battlefields in Ukraine, where the most fearsome killing tool is an FPV drone, a thing that just a few years back was universally considered a toy.

Whether we like it or not, any tool can be a killing tool.


> seems like the nature of the world is eat or be eaten

Surely this applies to how individuals should regard states, too. States generally wield violence, especially in the context of "national security", to preserve the security of the state, not of its own people. I trust my own state (the USA) to wield the weapons it funds, purchases, and manufactures about as much as I trust a baby with knives taped to its hands. I can't think of anything on earth that puts me in as much danger as the Pentagon does. Nukes might protect the existence of the federal government, but they put me in danger. Our response to 9/11 just created more people who hate my guts and want to kill me (and who can blame them?). No, I have no desire to live in a death cult anymore, nor do I trust the people who gravitate towards the use of militaries not to act in the most collectively suicidal way imaginable at the first opportunity.


> I can't think of anything on earth that puts me in as much danger as the pentagon does

Possibly true, but the state is also responsible for the policing that ensures the Pentagon is your greatest danger, rather than something closer to home.


Yeah, it sucks, but if the US gave up its death-cult ways, you'd still probably end up living in one eventually as a new conquering force filled the void, which seems inevitable going by history.


> the nature of the world is eat or be eaten

The nature of the world is at our fingertips; we are the dominant species here. Unfortunately, we are still apes.

Enforcing cooperation within a society does not always require a sanctioning body. Seeing it from a Skynet-military perspective is one-sided, but it is unfortunately a consequence of Popper's paradox of tolerance: if you uphold ideals (e.g. pacifist or tolerant ones) that require the cooperation of others, you cannot tolerate opposition, or you might lose your ideal.

That said, common sense can be a tool to achieve the same end. Just look at the widespread and hopefully continuing ostracism of nuclear weapons.

IMO it's a matter of zeitgeist and education too, and (un)fortunately, AI hits right in that spot.



