Wednesday, July 11, 2012

An Investigation of Spoken and Written Language in Thought, From Visual and Auditory Inputs


First of all, we must separate speech and writing. We know that speech and writing are two forms of human language. Both represent meaning [1], and both possess the properties of language. To avoid the ambiguity of the term 'language', which covers both speech and writing, in this discussion we do not treat them as language but as visual and auditory information. The fundamental distinction between speech and writing is that speech is auditory (acoustic) information, while writing is visual information. We will start from the basic level, seeing and hearing, to find out how they become language.

Speech and writing each have two aspects: production and reception. Production (speaking and writing) consists of outputs from the body; reception (listening and reading) consists of inputs to the body. Here we consider how visual and auditory inputs influence thought; production is not discussed because it results from thought.

While different types of sensory information are distinct before entering the body, they integrate with one another in thought, so that associations [2] are built within the same modality or between different modalities. It may be too hard to separate the different kinds of information in thought because they integrate within a single carrier, the brain. Visual information is so tightly associated with other types of information that we rarely have other types of information isolated from it. Different types of information are not clearly separated (Kolb and Whishaw, 2003, p. 179), and separating them has rarely been emphasized. Here we do not attempt to do so. Instead, we estimate their significance in thought by the effect of sensory inputs on thought. Moreover, the different types of sensory information are themselves defined at the sensory level rather than at the level of thought. This makes our investigation more feasible and realistic.

When speech and writing are considered as language, speech, rather than writing, appears to be the genuine form. At the sensory level, however, the situation is completely different. Vision is clearly the dominant sense, so visual information, rather than auditory information, should contribute more to thought. Regarding the relationship between speech and writing, one mistake that has been made is assuming that the dominant sense, vision, conforms to the non-dominant sense, hearing. We understand and explain things mostly via visual information: from the galaxy to the atom, what we actually refer to is visual information. The power of vision is taken for granted. One misconception people take for granted is to treat the visual information of an object as the object itself, for example, defining motion by the visual information of a moving object. Compared to vision, hearing carries very little information, so it conforms to vision. In most cases we understand another person's speech because the speech can be matched to visual information.

Visual information includes text and non-text. Auditory information includes speech (sound) and non-speech (sound). From this information point of view, text differs from non-text because they have different visual forms, not because text represents language. Likewise, speech differs from non-speech because they are different sounds, not because speech represents language. Speech sounds and other sounds, such as music, all belong to auditory information.

References

Kolb, B. and Whishaw, I. Q. 2003: Fundamentals of Human Neuropsychology (5th ed.). New York: Worth Publishers.

Footnotes

[1] Here, meaning refers to all information of all modalities except speech and writing.

[2] It should be noted that this association is not simply between one picture and one sound (or other piece of sensory information). It can be between one set of visual information and one set of other information, and the association can change over time.








