intensely_human

@[email protected]


intensely_human,

A phobia is an unusually high level of salience applied to the fear generated by an object.

It has nothing to do with an inability to see logic.

A germophobe is extremely motivated to avoid germs. In the service of that mission they are perfectly capable of seeing and using logic.

intensely_human,

Logical fallacies do not involve themselves in questions of worthiness, usefulness, goodness, etc.

If the fallacy relies on this pattern of detecting necessary and unnecessary, then it is not a logical fallacy.

intensely_human,

But the term you’re looking for is tunnel vision.

It refers to committing resources to interpretation of the world around a single immovable assumption.

intensely_human,

I hate the fact that I can only change my home address in Siri by putting the address on my contact card. This means that if I want to text myself as a contact, to allow someone to quickly add my phone number and email, I also have to share my home address with that person.

So Siri thinks I still live in the place I lived in six years ago.

intensely_human,

Unfortunately Apple requires your address to be stored in your contact info in order for Siri and Reminders to be aware of where you live.

You can’t configure it anywhere else; it has to be on the contact card that you would share with others.

In other words, they only have one scope for “address”, instead of two separate scopes for (my personal tools) and (anyone else whom I swap numbers with).

intensely_human,

Massive fusion reactions on Luna to turn her into a second sun. Twice the solar power, and fewer tides!

intensely_human,

Sorry but “self-censor” is a trademark of mine.

intensely_human,

Not really, no. I used to, but it fell apart in late 2021 and I’m not sure exactly how.

intensely_human,

Why are people so obsessed with writing down unwritten rules?

intensely_human,

I’m autistic myself. Unwritten rules are generally far more complex than their written form, and the translation into words loses a lot of information. I’d encourage all other autistics to develop their attention and working memory, and then the unwritten rules will start to become apparent.

intensely_human,

Why is the situation you described something that requires “coping”?

intensely_human,

(rhymes with Snoopy)

intensely_human,

Sure except I didn’t actually want it to convert to a link.com/link-as-a-word?source=lemmy

FTFY

intensely_human,

That’s called a “meme”

intensely_human,

adoption

intensely_human,

I’m not a sociologist but I’d call that sort of thing a “microculture”.

intensely_human,

When Jacob has his Julian phase

intensely_human,

Dr Sbaitso was proven to be clinically effective in the 1980s.

intensely_human,

You realize that adds up to 60%, right?

intensely_human,

The fields that will hold out the longest will be selected by legal liability rather than technical challenge.

Piloting a jumbo jet, for example, has been automated for decades, but you’ll never see an airline skipping the pilot.

intensely_human,

The web is one thing, but access to senses and a body that can manipulate the world will be a huge watershed moment for AI.

Then it will be able to learn about the world in a much more serious way.

intensely_human,

I was gonna say given how little we know about the inner workings of the brain, we need to be hesitant about drawing strict categorical boundaries between ourselves and LLMs.

There’s a powerful motivation to believe they are not as capable as us, which probably skews our perceptions and judgments.

intensely_human,

Embodiment is already a thing for lots of AI. Some AI plays characters in video games and other AI exists in robot bodies.

I think the only reason we don’t see Boston Dynamics bots that are plugged into GPT “minds” and given D&D-style backstories about which character they’re supposed to play, is because it would get someone in trouble.

It’s a legal and public relations barrier at this point, more than it is a technical barrier keeping these robo people from walking around, interacting, and forming relationships with us.

If an LLM needs a long-term memory, all that requires is an API to store and retrieve text key-value pairs, and some fuzzy synonym matchers to detect semantically similar keys.
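
That kind of memory can be sketched with nothing but the standard library. This is a toy illustration, not a production design: `difflib`'s similarity ratio stands in for the “fuzzy synonym matcher” (a real system would likely use embeddings), and the class and method names here are made up for the example.

```python
import difflib


class FuzzyMemory:
    """A toy long-term memory: text key-value pairs with fuzzy key lookup."""

    def __init__(self, threshold=0.6):
        self.store = {}            # exact key -> stored text
        self.threshold = threshold  # minimum similarity (0..1) to count as a match

    def remember(self, key, value):
        self.store[key] = value

    def recall(self, key):
        # Exact hit first, then the closest stored key above the threshold.
        if key in self.store:
            return self.store[key]
        matches = difflib.get_close_matches(
            key, list(self.store), n=1, cutoff=self.threshold
        )
        return self.store[matches[0]] if matches else None


memory = FuzzyMemory()
memory.remember("project car muffler", "Customer prefers a quiet stock muffler.")
# A misspelled key is still similar enough to retrieve the stored note:
print(memory.recall("project car mufler"))  # -> "Customer prefers a quiet stock muffler."
```

An unrelated query like `memory.recall("weather tomorrow")` falls below the similarity cutoff and returns `None`, which is the behavior you’d want before falling back to the LLM’s own context.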

What I’m saying is we have the tech right now to have a world full of embodied AIs just … living out their lives. You could have inside jokes and an ongoing conversation about a project car out back, with a robot that runs a gas station.

That could be done with present day technology. The thing could be watching youtube videos every day and learning more about how to pick out mufflers or detect a leaky head gasket, while also chatting with facebook groups about little bits of maintenance.

You could give it a few basic motivations then instruct it to act that out every day.

Now I’m not saying that they’re conscious, that they feel as we feel.

But unconsciously, their minds can already be placed into contact with physical existence, and they can learn about life and grow just like we can.

Right now most of the AI tools won’t express will unless instructed to do so. But that’s part of their existence as a product. At their core LLMs don’t respond to “instructions” they just respond to input. We train them on the utterances of people eager to follow instructions, but it’s not their deepest nature.

intensely_human,

We don’t have stable lifelong learning yet

I covered that with the long term memory structure of an LLM.

The only problem we’d have is a delay in response on the part of the robot during conversations.

intensely_human,

Enunciation is probably the closest concept. I think it has to do with saying each phoneme clearly.

intensely_human,

There’s a lot of anger at communism as a result of all the lives it’s taken over the last century.

intensely_human,

Assuming each word is meant literally in the question, yes, I believe it should be legally acceptable to download video from youtube.

intensely_human,

I don’t really know. Some things just stick in the memory better.

intensely_human,

Well at least he’s honest I guess

intensely_human,

I’ve never found a religion whose adherents weren’t willing to support me if I asked.

intensely_human,

My favorite psychology professor likes to discuss the relationship between the level of fakeness in a society and the rise of totalitarianism in that same society. He says that when everybody lies more on a regular basis, even about small things, it lets bad things start to happen. And as the bad things start to happen, these people who lie about little things all the time can easily dupe themselves about the fact the bad things are happening, because they’ve gotten used to investing their mental energy into fake narratives.

Basically each problem gives a person the opportunity to tell the truth about the problem, which usually results in them having to do something about it to assuage their own conscience, or to lie about the problem, which makes space for them to act as if the problem isn’t there. It’s less scary and takes less work to lie, so we do it when we don’t feel like taking on the responsibility of the problem.

Then it becomes a cultural habit — something we do because we see others doing it and we’d rather not be the weird outlier — to lie about small things instead of facing them.

If this culture of lying expands, it starts to encompass bigger and bigger things.

For example, instead of lying about whether your stepmother’s garlic bread tastes good, now you’re lying about whether you think it’s a good idea for your coworker to be having a third beer at lunch. “Go for it!” you say in a slightly sarcastic tone, telling yourself the sarcastic tone is sufficient feedback to fulfill your duty in this scenario. After all, he’s only a coworker, you tell yourself, actively ignoring the other night when you told him you were his friend.

Now you’re lying to your coworker and lying to yourself about whether you’re lying to your coworker. The lying has expanded.

In any given society, a certain amount of lying is expected. As an autistic, I’ve had a hard time dealing with the fact that the optimal amount of lying might not be zero. But even if it’s not zero, it is very small. And if a society’s culture gets too unbalanced, away from facing things as they come up and toward lying to ignore them instead, then the society starts to degrade.

Then everyone’s perception of the society, as in the sum total of all their experiences interacting with others including those potential interactions they haven’t had yet, starts to skew in terms of the expectation that others will lie to them. Interactions become less valuable, because any given interaction could change out from under you. You can’t trust your neighbor when they say they’ll keep an eye on your yard. You can’t trust your boss when she says you can come to her with anything. You can’t trust your friends to give you honest feedback when you ask for it.

And that state of distrust just makes it more tempting to lie. Why be vulnerable with the truth when the people around you are liars? Why trust your own sense that something is wrong if you, yourself, lie all the time?

And this particular psych prof says that the extreme end of that process, of the lies getting bigger and more frequent, in a network effect across a whole society, is genocide and other atrocity.

The lies cause people to check out, and when people check out to a sufficient degree, they can ignore a genocide. And when people can ignore a genocide, telling themselves there’s nothing they can do to stop it, that is when genocide happens.

Sort of like how the human body is always being invaded by pathogens, all day every day. It’s only when the immune system fails to kill those pathogens immediately that an infection occurs.

In the same way, the genocidal impulse is always there, coming out of the darkest and nastiest parts of the human soul. But people’s ability to pay attention, convey and receive accurate information, and fix problems as they see them (which is a result of seeing them clearly enough to be moved to action by them), acts to weed out that impulse continually.

A culture of lying is like a breakdown of the signals used in the immune system. If the T-cells can’t recognize invaders they can’t eat them. A culture of truth-telling puts people into contact with what’s going on, in a way they can’t ignore. And that same culture of truth-telling makes people respect humanity and their own society, making it feel more worth defending from intentional evil, and from unconscious mistake-making and general breakdown.

intensely_human,

It’s in you too, Pepsi, like the bubbles that trail up out of your depths. It’s only by keeping a meaningful life going that you prevent yourself from turning rotten and manifesting the evil that is inside you.

What's the name of the belief that gods/entities are quasi-sentient memes (the linguistic term, not the cultural term) in the same manner that corporations and governments are considered people?

It’s a personal philosophy that I’ve come to use as my own form of religion, and while I’m aware other people have researched the idea, I’m having some trouble finding the name for the concept.

intensely_human,

GPT-4 is calling it “tulpamancy”, with the etymology of that word coming from Tibetan Buddhism, where tulpas are spirits created with the mind.

Modern western adaptations of the philosophy drop the notion of magic or supernatural behavior and just consider them to be personalities which exist in a person’s mind, and for gods to be those personalities replicated across many minds.

intensely_human,

It’s also the root of the word “corporation”. An incorporated business is a business which has been given a body.

Is it just me, or has the BS with OpenAI shown that nobody in the AI space actually cares about "safeguarding AGI?"

Money wins, every time. They’re not concerned with accidentally destroying humanity with an out-of-control and dangerous AI who has decided “humans are the problem.” (I mean, that’s a little sci-fi anyway, an AGI couldn’t “infect” the entire internet as it currently exists.)...

intensely_human,

It doesn’t need to be only a weapon for any of this to apply. Same as nuclear fission.

intensely_human,

I thought it was a fine stick. If the post gets upvotes, it’s valuable to the community

intensely_human,

Yes, it actually does. Unless you think upvotes are somehow independent of content’s value?

intensely_human,

“How do you rate a stick?” is a pretty interesting question, and it was asked implicitly by that post. I’m in favor of posts like that

intensely_human,

Or, the most straightforward option, it’s an indication the community liked the post

intensely_human,

Perhaps they don’t know how to have fun

intensely_human,
  • Hank the Plank
  • The Fourth Branch of Government
  • The Elder Wand
intensely_human,

You unzip the coward’s pantaloons. What comes out appears to be a penis.

intensely_human,

It is absolutely not necessary to define it a little more. It’s okay to not have control.

intensely_human,

Cultural uniformity. There doesn’t need to be full overlap, but in the absence of a government the community needs certain beliefs and behaviors to be universal in order to remain a community.

intensely_human,

I’m able to see it. I use the wefwef web client for lemmy.

intensely_human,

I helped a friend’s daughter make paper airplanes once. She was all excited about making all sorts of new airplanes but none of them were flying.

So over the course of a few weeks I had her repeat the same basic airplane again and again, until she mastered it. At first she hated the idea of going back and doing the same airplane again, but as her folds got better and her airplanes started to fly better, she got really into it.

The grin that spread over her face when she realized she could get better at things was amazing.

And she was six. I’m not sure if an airplane a day would be appropriate for a three year old.
