Member 2495
15 entries

Contributor to projects:
The Total Library
Mariana Soffer (F, 45)
Buenos Aires, AR
Immortal since Feb 16, 2010
Uplinks: 0, Generation 3

I am an artificial intelligence researcher. I studied for a Master's in Information Science in California, where I specialized in genetic research. Currently I am doing research in NLP (natural language processing), particularly in the area of opinion mining. I am also interested in neuroscience, Buddhism, literature, music, and anthropology, among other things.
    Now playing SpaceCollective
    Where forward thinking terrestrials share ideas and information about the state of the species, their planet and the universe, living the lives of science fiction.
    Featuring Powers of Ten by Charles and Ray Eames, based on an idea by Kees Boeke.

    What is the meaning of meaning? We can view meaning at two levels. First, it is a cognitive process whereby we make sense of the stream of information that assails us in each moment. At a higher level, deep meaning is what we seek in life when we look for answers to spiritual questions such as 'Why are we here?'

    Here are some of the existing academic theories about how we make sense of the world and arrive at meaning.

    1. Constructivism: We try to make sense of the world by making use of constructs, which are perceptual categories that we use when evaluating things.

    2. Framing: A frame is the combination of beliefs, values, attitudes, mental models, and so on which we use to perceive a situation. We effectively look through this frame in the way we would look through tinted spectacles. The frame significantly affects how we infer meaning and hence understand the situation.

    3. Focusing effect: When we make judgments, we tend to weigh attributes and factors unevenly, putting more importance on some aspects and less on others. This is typically due to factors such as stereotyping and the schemas we use, which bring certain factors to mind and downplay others.

    4. Schema: A schema is a mental structure we use to organize and simplify our knowledge of the world around us. We have schemas about ourselves, other people, mechanical devices, food, and in fact almost everything.

    5. Personal constructs: People develop internal models of reality, called constructs, in order to understand and explain the world around them, in the same way that scientists develop theories. Constructs are built from observation and experimentation: they start as unstable conjectures, then change and stabilize as more experience and evidence are gained. Constructs are often defined by words, but they can also be non-verbal and hard to explain.

    6. Symbolic interaction: People act based on the symbolic meanings they find within any given situation. We thus interact with the symbols, forming relationships around them. The goal of our interactions with one another is to create shared meaning.

    7. Objectification: Complex ideas are, almost by definition, difficult to understand. To help us make sense of them, we turn them into concrete images. Objectification works through three processes: giving an idea physical properties, turning it into a picture, and turning it into a person.

    8. Speech act: Getting a glass of water is an action. Asking someone else to get you one is also an act. When we speak, our words do not have meaning in and of themselves; their meaning is very much affected by the situation, the speaker, and the listener.

    9. Social interaction: In order for people in groups to talk with one another, they need a system of common understanding, in particular of concepts and ideas that are outside of 'common' understanding or which have particular meaning for that group. Words thus become imbued with special meaning within particular social groups.

    10. Story models: One way in which we explain the world around us is to create stories about it. In particular, when we are faced with complex situations, we pick out what seem to be the key elements and turn them into a story.
    Mon, Mar 8, 2010  Permanent link
    Categories: theory, ontology, meaning, symbol
    Sent to project: The Total Library
      RSS for this post
      Promote (9)
      Add to favorites (2)
    Synapses (4)

    In many languages, Greek and Latin roots constitute an important part of the scientific vocabulary. This is especially true for the terms referring to fields of science. For example, the equivalent words for mathematics, physics, chemistry, geology, and genealogy are roughly the same in many languages. As for computer science, numerous words in many languages are from American English, and the vocabulary can evolve very quickly. An exception to this trend is the word referring to computer science itself, which in many European languages is roughly the same as the English informatics: German: Informatik; French: informatique; Spanish, Italian, and Portuguese: informática; Polish: informatyka.

    We live in the age of information. It pours upon us from the pages of newspapers and magazines, from radio loudspeakers, and from TV and computer screens. Most of this information takes the form of natural language text. Even in the area of computers, the larger part of the information they manipulate nowadays is text. It looks as if the personal computer has mainly turned into a tool to create, proofread, store, manage, and search for text documents. Our ancestors invented natural language many thousands of years ago for the needs of a developing human society. Modern natural languages develop according to their own laws, in each epoch being an adequate tool for human communication, for expressing human feelings, thoughts, and actions.

    For the last two centuries, humanity has successfully coped with the automation of many tasks using mechanical and electrical devices, and these devices faithfully serve people in their everyday life. In the second half of the twentieth century, human attention has turned to the automation of natural language processing. People now want assistance not only in mechanical, but also in intellectual efforts.

    We need resources for NLP, but most of them, such as WordNet and the General Inquirer, are in English, with only a few available in other languages. Lexical and ontological resources are fundamental for NLP, so this puts non-English speakers at a serious disadvantage.

    The most used language on the Internet, according to Wikipedia, is English. Although the total number of native English speakers in the world is about 322 million, only around one fifth of all Internet users, the amount of English web content approaches 80%.

    Generally speaking, once a language has attained the position of a universal language, that position tends to affirm and extend itself. Since "everyone" knows and uses English, people are almost forced to learn English, use it, and learn it better.

    Besides, the importance of the Internet is growing rapidly in all fields of human life, including not only research and education but also marketing and trade, as well as entertainment and hobbies. This means it is becoming more and more important to know how to use Internet services and, as part of this, to read and write English.

    But English is changing fast too. No area of the culture is colliding with the web more intensely, for the web has changed English more radically than any invention since paper, and much faster. According to Paul Payack, who runs the Global Language Monitor, "there are currently 988,974 words in the English language, with thousands more emerging every month". By his calculation, English will adopt its one millionth word in late November. To put that statistic another way, for every French word, there are now ten in English.

    So far from debasing the language, the rapid expansion of English on the web may be enriching the mother tongue. Like Latin, it has developed different forms that bear little relation to one another: a speaker of Hinglish (Hindi-English) would have little to say to a Chinglish speaker. But while the root of Latin took centuries to grow its linguistic branches, modern non-standard English is evolving at fabulous speed. The language of the internet itself, the cyberisms that were once the preserve of a few web boffins, has simultaneously expanded into a new argot of words and idioms: Ancient or Classic Geek has given way to Modern Geek.
    Mon, Mar 1, 2010  Permanent link

    Sent to project: Polytopia

    Philosophers have long wondered about the connection between metaphor and thought:

    "We believe that we know something about the things themselves when we speak of trees, colors, and flowers, and yet we possess nothing but metaphors for things, metaphors which correspond in no way to the original entities". - Nietzsche

    "The inevitable clash of metaphors in all writing shows only too well that language may subvert or exceed an author's intended meaning". - Derrida

    "A metaphor is often indispensable to express a concept (or meaning) for which words just do not exist in the language. Entire domains (spheres of knowledge such as anatomy and psychology) are mapped in other domains for lack of appropriate words". - Michel Bréal

    "Metaphors are markers of the roots of thought itself. They are the main mechanisms through which we comprehend abstract concepts and perform abstract reasoning. Abstract thought would be meaningless without bodily experience. People think with their brains and their brains are part of their bodies as well". - Lakoff and Johnson

    "I think that metaphor really is a key to explaining thought and language. The human mind comes equipped with an ability to penetrate the cladding of sensory appearance and discern the abstract construction underneath - not always on demand, and not infallibly, but often enough and insightfully enough to shape the human condition. Our powers of analogy allow us to apply ancient neural structures to newfound subject matter, to discover hidden laws and systems in nature, and not least, to amplify the expressive power of language itself". - Steven Pinker

    When we say someone is a warm person, we do not mean that they are running a fever. When we describe an issue as weighty, we have not actually used a scale to determine this. These phrases are metaphorical: they use concrete objects and qualities to describe abstractions like kindness or importance, and we use them so often that we hardly notice them.

    Nowadays cognitive scientists have begun to see the basic metaphors that we use all the time not just as turns of phrase, but as keys to the structure of thought. By taking these everyday metaphors as literally as possible, psychologists are upending traditional ideas of how we learn, reason, and make sense of the world around us.

    They also suggest that much of what we think of as abstract reasoning is in fact a sometimes awkward piggybacking onto the mental tools we have developed to govern our body’s interactions with its physical environment. Put another way, metaphors reveal the extent to which we think with our bodies. “The abstract way we think is really grounded in the concrete, bodily world much more than we thought,” says John Bargh.

    Several studies on the relation between body and metaphor have been done. In one of them, subjects were asked to hold a cup of either iced or hot coffee, not knowing it was part of the study, and a few minutes later were asked to rate the personality of a person who was described to them. The hot coffee group, it turned out, consistently described a warmer person, rating them as happier, more generous, and more caring than the iced coffee group. The effect seems to run the other way as well.

    Research on “where metaphor is grounded” is also being performed. It shows that metaphor is grounded neither in logic nor in literary theory: there is no purely literal language in terms of which metaphor may be evaluated and objectively assessed. In fields ranging from cognitive psychology to social anthropology, metaphors are currently subject to extensive analysis, but the findings can only be partial, and relative to the discipline involved. What is becoming clearer is that metaphors, like linguistic theory, are rooted in the beliefs, practices and intentions of language users.
    Fri, Feb 26, 2010  Permanent link

    Sent to project: The Total Library

    Information overload
    It is great to have access to huge amounts of information, but since we are not reading any faster than before, we cannot take advantage of this new situation. Therefore a discipline that helps human beings deal with all that data is fundamental.

    Natural language processing is the process of building computational models for understanding natural language. It studies the problems of automated generation and understanding of natural human languages. NLP includes natural-language-generation systems that convert information from computer databases into normal human language and natural-language-understanding systems that convert samples of human language into more formal representations that are easier for computer programs to manipulate.

    NLP also studies the information contained in human-generated texts, along with their language structure. NLP is a multidisciplinary field that draws on artificial intelligence techniques, multivariate statistics, linguistics, and any other domain that can help process, generate, or interpret language with computers.

    NLP Facts
    -The Turing test is a proposal for a test of a machine's ability to demonstrate intelligence. Described by Alan Turing in the 1950 paper "Computing Machinery and Intelligence," it proceeds as follows: a human judge engages in a natural language conversation with one human and one machine, each of which tries to appear human. All participants are placed in isolated locations. If the judge cannot reliably tell the machine from the human, the machine is said to have passed the test. In order to test the machine's intelligence rather than its ability to render words into audio, the conversation is limited to a text-only channel such as a computer keyboard and screen. This test was the first mainstream experiment related to NLP.
    -Text is the largest repository of human knowledge, and it is growing quickly: there are emails, news articles, web pages, chat archives, scientific articles, insurance claims, customer complaint letters, transcripts of phone calls, technical documents, government documents, patent portfolios, court decisions, contracts, and so on.

    -Nowadays we have access to huge amounts of information, much more than in past decades. One of the problems with this is that we are not reading any faster than before, so we cannot take full advantage of this new situation. NLP tries to optimize the human usage of information.

    -Dealing with natural language is a difficult task. We need to understand multiple disciplines, including multivariate statistics, learning algorithms, clustering, hidden Markov models, and part-of-speech tagging. We also need knowledge about language, grammar, ontologies, and folksonomies.

    -Processing a huge amount of data in a limited amount of time requires special algorithms. We generally apply algorithms that have low computational cost, or algorithms that reduce the amount of computation needed by pre-processing the data. For this there are techniques for reducing the size of the text: removing stop words, removing words that appear too often, and removing words that appear very rarely.
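    As an illustration, this pre-processing step can be sketched in a few lines of Python. The stop word list and the frequency thresholds below are hypothetical; real systems use much larger lists and corpus-tuned cutoffs.

```python
from collections import Counter

# Small illustrative stop word list (real systems use hundreds of entries).
STOP_WORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "it", "we"}

def preprocess(documents, min_count=2, max_ratio=0.5):
    """Tokenize documents, then drop stop words, words that appear in very
    few documents, and words that appear in too large a share of them."""
    tokenized = [[w.lower() for w in doc.split()] for doc in documents]
    # Document frequency: in how many documents does each word occur?
    doc_freq = Counter(w for doc in tokenized for w in set(doc))
    n_docs = len(documents)
    keep = {
        w for w, df in doc_freq.items()
        if w not in STOP_WORDS
        and df >= min_count              # drop words that appear very rarely
        and df / n_docs <= max_ratio     # drop words that appear too often
    }
    return [[w for w in doc if w in keep] for doc in tokenized]

docs = ["the cat sat", "the cat ran", "a dog ran fast", "a dog barked loudly"]
print(preprocess(docs))  # → [['cat'], ['cat', 'ran'], ['dog', 'ran'], ['dog']]
```

    Document frequency is used here rather than raw counts, which is one common choice; either works for shrinking the vocabulary before heavier processing.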

    -The applications of NLP include answering queries, identifying spam, recognizing the main theme of a document, grouping similar texts, obtaining the main keywords of a document, detecting syntactic errors, and identifying the secondary themes of a document.
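    One of these applications, spam identification, is commonly approached with a naive Bayes classifier over word counts. Here is a minimal sketch; the training phrases and the add-one smoothing constant are illustrative choices, not a production setup.

```python
import math
from collections import Counter

def train(labeled_docs):
    """labeled_docs: list of (text, label) pairs, label 'spam' or 'ham'.
    Returns per-label word counts and per-label document counts."""
    word_counts = {"spam": Counter(), "ham": Counter()}
    doc_counts = Counter()
    for text, label in labeled_docs:
        doc_counts[label] += 1
        word_counts[label].update(text.lower().split())
    return word_counts, doc_counts

def classify(text, word_counts, doc_counts):
    """Pick the label with the highest log-probability,
    using add-one (Laplace) smoothing for unseen words."""
    vocab = set(word_counts["spam"]) | set(word_counts["ham"])
    total_docs = sum(doc_counts.values())
    best_label, best_score = None, float("-inf")
    for label in ("spam", "ham"):
        score = math.log(doc_counts[label] / total_docs)  # prior
        total_words = sum(word_counts[label].values())
        for word in text.lower().split():
            count = word_counts[label][word] + 1          # smoothing
            score += math.log(count / (total_words + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

training = [("win money now", "spam"), ("free money offer", "spam"),
            ("meeting at noon", "ham"), ("lunch at noon tomorrow", "ham")]
wc, dc = train(training)
print(classify("free money", wc, dc))  # → spam
```

    Working in log space avoids numeric underflow when documents are long, which is why the scores are summed logs rather than multiplied probabilities.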
    Sun, Feb 21, 2010  Permanent link

    Sent to project: The Total Library

    Every moment of awareness is a pile of interpretations all in superposition. A single state of mind is layered with harmonics of meaning - yet somehow remains one experience - Susan Blackmore

    Reading and understanding language is a skill that most people take for granted. Processing language in the brain is very complex and involves many variables: most language is processed in the left hemisphere, while the right hemisphere processes visual and motor activities, states Stanislas Dehaene.

    "When we look at a text, photons are bouncing off those black squiggles and lines — the letters in the particular sentence — and colliding with a thin wall of flesh at the back of your eyeball. The photons contain just enough energy to activate sensory neurons, each of which is responsible for a particular plot of visual space on the whole image. The end result is that, as you stare at the letters, they become more than mere marks on a page. You begin to read," says Jonah Lehrer.


    • Seeing the letters is just the start of the reading process. Although our eyes are focused on the letters, we learn to ignore them. Instead, we perceive whole words, chunks of meaning. Once we become proficient at reading, the precise shape of the letters — not to mention the arbitrariness of the spelling — doesn't even matter, which is why we read word, WORD, and WoRd the same way.
    • Until recently, most researchers assumed that when we read, both eyes look at the same letter of a word concurrently. But it was found that our eyes look at different letters in the same word and then combine the different images through a process known as fusion. Studies have clearly shown that we experience a single, very clear and crisp visual representation due to the merging of the two different images from each eye.
    • Language tends to be stored in the brain in audio format for processing, so besides reading the text we automatically convert it to speech in our own heads. After that, the process of making sense takes place.
    • Studies have shown that when a word is checked against the storehouse of words in the brain - whether it is a written word or a word-sound - only the main part of the word is checked first, and then the ending is processed separately. For example, 'sing', 'singing' and 'singer' would all be checked against the base word 'sing'.
    • Once we recognize the printed words, we need to make sense out of them. Understanding how meaning arises from those words is one of the most challenging tasks in the cognitive sciences.
    • More on making sense and meanings can be found here and here.
    • There is an ongoing debate about whether the new kind of reading experience provided by the internet is beneficial or not. Some interesting articles are worth exploring: Is Google making us stupid and How is Google making us smarter. It would be interesting to incorporate the latest scientific findings about how our brain reads in order to draw new and more accurate conclusions.
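    The base-word lookup described in the list above ('sing', 'singing', 'singer' all checked against 'sing') is what NLP calls stemming. A crude suffix-stripping sketch follows; the suffix list is illustrative only, and real stemmers such as Porter's apply much richer ordered rule sets.

```python
# Hypothetical suffix list, tried in order; longest candidates first.
SUFFIXES = ("ing", "er", "ed", "es", "s")

def stem(word):
    """Strip the first matching suffix, keeping a stem of at least 3 letters."""
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

print(stem("singing"), stem("singer"), stem("sing"))  # → sing sing sing
```

    The minimum-stem-length guard is what keeps 'sing' itself intact even though it ends in 'ing'.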
    Tue, Feb 16, 2010  Permanent link
    Categories: Language, understanding, read
    Sent to project: The Total Library