Applied Linguistics

Text 1:

Applied linguistics: A twenty first century discipline (Grabe, 2004)

→ The text says that over the years, the term applied linguistics has been defined and interpreted in different ways.

1950→ The term A.L. was meant to reflect the insights of structural and functional linguistics that could be applied directly to second language teaching.

1960→ A.L. continued to be associated with the application of linguistics to language teaching. It also became involved in matters of language assessment, language policies, and the new field of SLA (focusing on learning rather than on teaching). So, there was a reinforcement of second language teaching as A.L., and the field began to emerge as a language-centered, problem-solving enterprise.

1970→ Expansion as a discipline that addresses real-world, language-based problems. Although language teaching remained central to A.L., it also took into its domain the subfields of language assessment, SLA, multilingualism, language policy and planning, and language teacher training.

→ The notion that applied linguistics is driven first by real-world language problems rather than by theoretical explorations of language knowledge and (L1) language development is what sets the field apart from formal linguistics and, later, from sociolinguistics, with its own emphasis on the description of social variation in language use (typically minus the application to language problems).

Perspective shift in A.L

1980→ These trends evolved during the 1980s. The central issue remained the need to address language issues and problems as they occur in the real world. Thus, definitions of A.L. in the 1980s emphasized both the range of issues addressed and the types of disciplinary resources used in order to work on language problems. During this time A.L. extended in a systematic way beyond language teaching and language learning issues to encompass language assessment, language policy and planning, language technology, etc.

All of this was well documented over ten years in the journals AILA Review, Annual Review of Applied Linguistics, Applied Linguistics, and International Journal of Applied Linguistics.

1990→ A common trend was to view A.L. as incorporating many subfields and drawing on many supporting disciplines in addition to linguistics (anthropology, education, English studies, modern languages, political science, psychology, etc.). A.L. evolved still further during the 1990s and 2000s, breaking away from the common framing mechanisms of the 1980s.



1960-1990→ Generative linguistics dominated the linguistics landscape: Chomskyan linguistics almost defined the field, at least in the USA. Since then, A.L. has been turning back to more cognitive and descriptive approaches to language knowledge, and to theories of language representation that have more realistic applicability to the sorts of language issues explored by A.L.

→ According to Grabe, linguistics, viewed from this larger perspective, is still central to the overwhelming majority of the areas of inquiry generally recognized as falling under the umbrella of the discipline of A.L.

He also states that A.L. must draw on the knowledge bases of phonetics, phonology, morphology, syntax, semantics, pragmatics, and written discourse because they are relevant to any A.L. issue.

→ For A.L. research, the shift to discourse analysis, descriptive data analysis, and the interpretation of language data in their social/cultural settings all indicate a shift toward valuing observable language data over theoretical assumptions about what should count as data.

Trends and perspectives in the 1990s and 2000s

1°→ A.L. research in language teaching, language learning, and language teacher education is now placing considerable emphasis on notions of language awareness, attention and learning, “focus on form” for language learning, learning from dialogic interaction, patterns of teacher-student interaction, task-based learning, and the teacher as researcher through action research. One emphasis in language teacher education has been the move toward engaging teachers in the practice of action research.

→ A second emphasis in A.L. has been the role of critical studies: critical awareness, critical discourse analysis, critical pedagogy, student rights, and practice and ethics in language assessment. Although this work has not yet seen much application, the debate is expected to continue over the next decade.

→ A third emphasis is on language use in academic, disciplinary, and professional settings. It examines the ways in which language is used by participants and in texts in various academic, professional, and occupational settings. It also emphasises how language can create obstacles for those who are not aware of appropriate discourse rules and expectations (ESP).



→ A fourth emphasis centers on descriptive (usually discourse) analyses of language in real settings and the possible application of analysis in corpus linguistics, register variation, and genre variation. Other important applications of corpus linguistics include more teacher-learner directed resources. 

→ A fifth emphasis in A.L. research addresses multilingualism and bilingual interaction in school, community, and work and in professional settings or policy issues at regional and national levels because bilingual people outnumber monolingual people, and because this bilingualism is associated with the need to negotiate life situations with other cultural language groups. Multilingualism covers issues such as bilingual education, migrations and language policies.

→ A sixth emphasis focuses on the changing discussion in language testing and assessment: validity remains a major theme for language testers. Emphases on technology applications, ethics in assessment, innovative research methodologies, the roles of standardized assessment, standards for professionalism, and critical language testing are all part of language assessment and, by extension, of A.L.

→ A seventh emphasis focuses on the resources and perspectives provided by neurolinguistics and brain studies associated with language learning and language use. Significant advances in understanding the relations between brain functioning and language learning suggest that research insights from neurolinguistics may soon become too important to ignore. This emphasis may become an important sub-area of applied linguistics within a decade.

Definition of Applied Linguistics according to Grabe

Grabe quotes Widdowson (1998, 2000, 2004, 2005), who has argued that A.L. is not so much an interdisciplinary discipline as a mediating field or domain between the theoretical plane of linguistics and language knowledge on the one hand, and its application to problems that arise in a number of real-world settings on the other.

However, Grabe says that he has defined A.L. as a practice-driven discipline that addresses language-based problems in real-world contexts. Consequently, it remains reasonable to see A.L. as a discipline that engages interdisciplinary resources (including linguistic resources) to address real-world language problems.

Conclusion

A.L. practitioners will need to know more about computer technologies, statistical applications, sociocultural influences on research, and ways to analyze language data.

→ A.L. will direct more attention to issues of motivation, attitudes, and affect because those factors potentially influence many language-based problems.



Text 2:

An overview of Applied Linguistics (Schmitt & Celce-Murcia, 2010)

Definition of Applied Linguistics

→ A.L. means using what we know about (a) language, (b) how it is learned, and (c) how it is used, in order to achieve some purpose or solve some problem in the real world.

→ The dominant application has always been the teaching and learning of second or foreign languages (L2). In some countries an L2 is a common denominator or lingua franca. English is the main second language being studied around the world.

→ Traditionally, the primary concern of A.L. has been second language acquisition theory, second language pedagogy and the interface between the two.

The development of Applied Linguistics (Early history)

→ Linguistic development in Ancient Greece owes much to Plato’s and Aristotle’s contributions; they contributed to the design of a curriculum beginning with good writing (grammar), then moving on to effective discourse (rhetoric), and culminating in the development of a dialectic to promote a philosophical approach to life.

→ In 1755 Samuel Johnson published his Dictionary of the English Language. It had the effect of standardizing English spelling, which until that time had been variable.

→ Robert Lowth published an influential grammar in which he prescribed what correct grammar should be. He did not have any linguistic background, so he based his English grammar on a classical Latin model, even though the two languages are organized in different ways. English was thus described through a linguistic system borrowed from Latin.



Applied Linguistics during the Twentieth Century

During the 20th century there was an acceleration of change in linguistic description and pedagogy, with successive movements influencing the field before being replaced by newer ones.

1° Grammar-translation model (Plötz)

→ At the beginning of the century, second languages were taught using this model which was codified in the 19th century by Karl Plötz.

→ A typical lesson would present one or two new grammar rules, a list of vocabulary items, and some practice examples to translate from L1 into L2 or vice versa; it thus emphasised accuracy in grammar rules.

→ The content focused on reading and writing literary materials.

→ Problems: it focused on the ability to analyse language rather than on how to use it, and its reading and writing focus did not promote an ability to communicate orally in the target language.

2° Direct Method

→ It emphasized exposure to oral language, with listening and speaking as the main skills. Meaning was related directly to the target language, without translation.

→ It imitated how the mother tongue is learnt naturally (1° listening, 2° speaking, 3° reading and writing).

→ It banished the use of L1 in the classroom.

→ Problems: it required teachers to be highly proficient in the target language, which was not always possible. It also mimicked L1 learning without taking into account the differences between L1 and L2 acquisition.

3° Reading Method (Michael West, UK)

→ West was interested in increasing learners’ exposure to language through reading, so he promoted reading skills through vocabulary management.

→ He controlled the number of words appearing in texts, reducing the lexical load for readers. This was part of the approach called the “Vocabulary Control Movement”.

→ The three methods mentioned above continued to hold sway until World War II, when their weaknesses became obvious as the American military found itself short of people who were conversationally fluent in foreign languages. It needed a way of training soldiers in oral language.



4° Audiolingualism (Army method)

→ American structuralists stepped into the gap and developed a programme which borrowed from the Direct Method (listening/speaking). It drew its rationale from behaviorism (the dominant psychological theory), which held that language learning was the result of habit formation.

→ “Good” language habits were reinforced through close attention to pronunciation, intensive oral drilling, and memorization (students did not learn through analysis of the target language).

→ These students were highly motivated and their success was dramatic.

5° UG of Chomsky 

→ Supplanting the behaviourist idea of habit formation, language was now seen as governed by cognitive factors, in particular by a set of abstract rules assumed to be innate.

→ Chomsky suggested that children are born with an understanding of the way language works, which he referred to as Universal Grammar (UG). Knowing the underlying principles of language and their parameters, children would only need enough exposure to a particular language.

→ This parameter setting would require less exposure than the habit-formation route, and so it appeared a more convincing account of how children acquire language so quickly.

6° Communicative competence of Hymes

→ Hymes emphasized that language competence consists of more than just being able to form grammatically correct sentences; it also includes knowing when, where, and to whom to use these sentences.

→ This helped to swing the focus from language correctness (accuracy) to appropriacy.

7° Systemic Functional grammar of Halliday (1973)

→ It was an alternative to Chomsky’s approach, in which language was seen not as something exclusively internal to a learner, but rather as a means of functioning in society. Halliday identified three types of function:

1.Ideational (telling people facts or experiences)

2.Interpersonal (maintaining personal relationships with people)

3.Textual (expressing connections and organization within a text)

→ So, this supported a more communicative type of pedagogy.







8° Council of Europe attempted to create a Europe-wide language teaching system

Language is constructed through functions (what people want to do with the language) and notions (what people want to convey). Therefore, the second language system is based on a survey of L2 learners’ needs.


9° Monitor Theory (i + 1) (Krashen, 1980)

→ Krashen’s theory of acquisition focused attention on the role of input. This theory posited that a second language is mainly unconsciously acquired through exposure to “comprehensible input” rather than learnt through explicit exercises, that acquisition requires a focus on meaning rather than form, and that a learner’s emotional state can affect it (the affective filter).

→ Pedagogical implications: the classroom should supply a rich source of language exposure that is meaning-based and understandable, always including some elements just beyond the learners’ current level.

→ This methodology emphasized the use of language for meaningful communication (CLT: communicative language teaching). The focus was on learner fluency rather than grammatical accuracy, and students could even be taught non-language subjects (history, biology, etc.) through the L2.

→ Problems: although students became fluent in the L2, they continued to make certain persistent grammatical errors, even after many years of instruction.

→ During the 1980s tests were evaluated according to three principal criteria:

1.Validity: did the test really measure what it was supposed to measure?

2.Reliability: did the test perform consistently from one administration to the next?

3.Practicality: was the test practical to give and mark in a particular setting?

Technology was advancing throughout the century: personal computers made sophisticated language programs available to the individual user (learner, teacher, or researcher). This opened the door to computer-assisted language learning (CALL), where learners could work on individual computers truly at their own pace.

Sociocultural theory (Vygotsky)

It emphasized individual-social integration by focusing on the relationship between the sociocultural endowment (interpersonal) and the biological endowment (intrapersonal), out of which the human emerges.

It is through social interaction with others that humans develop their language cognition. Most language use is co-constructed with others and is not simply the product of one individual alone.

Conclusion

→ The last 40 years have seen a move towards viewing language in much more integrative and holistic terms.



Text 3:

Final Frontiers in AL – David Crystal (2010)

·The author establishes that the frontiers in AL are ever widening; unanswerable questions continue to arise.

·Gricean notion of conversation: four maxims underlying the efficient, cooperative use of language: quality, relevance, quantity, and manner.

·According to Crystal, actual language is full of lies, irrelevancies, ramblingness and ambiguity. However, what happens on the Internet is quite different.

·On the Internet, interactional anonymity has released conflicting notions of truth as a norm; this phenomenon is known as “spoofing” and “trolling”, where messages are sent with the intention of causing communicative disruption. Therefore, the notion of the “conversational turn” is fundamental. There is also a need for an “informed internet pragmatics”.

·“Language and culture” issues cannot be explained by semantics and pragmatics alone; there is a need for a third domain that would embrace cultural comprehension, with its differences realized through language: “Global English doesn’t give you global understanding; once we understand the culture, we know the meaning of words.”

Global English: new varieties of English and regional standards

·A model of linguistically mediated cultural difference is recommended, since it allows us to recognize several types of context, each of which makes a different kind of demand on the English learner.

·The author stresses the need for a descriptive linguistics which is not just stylistically informed and pragmatically aware, but which has a full sociolinguistic dimension, including an explicit and comprehensive framework of cultural distinctiveness.

·According to Pit Corder, “it is not a question of learning to produce utterances which are acceptable; they must also be appropriate”.

·As a conclusion, there are no final frontiers!



Text 4:

Sociolinguistics (Llamas & Stockwell, 2010)

Sociolinguistics is a fieldwork discipline, a characteristic it shares with applied linguistics; this means that no introspection, intuitive conclusions, or impressionistic evaluation is involved.

Most general definition: the study of language in society.

More specific: the study of linguistic indicators of culture and power. Sociolinguists focus on language and emphasize the social force of language events in the world. This allows them to use the tools of linguistics (grammar, vocabulary, corpus linguistics, phonology, etc.) while also considering the influences of ethnicity, gender, ideology, and social rank on language events. Sociolinguists take an “ethically involved” position, meaning that they have a special responsibility to use their privileged knowledge to influence the direction of, for example, government language policies and educational practices.

Keywords:

→ Idiolect vs. sociolect: sociolinguistics is more interested in sociolects, suggesting patterns and frameworks by which sociolects seem to operate. Social patterns are made explicit.

→ Standard vs. non-standard variation: nations choose a “proper” variety of a particular language. Standardization: this “standard” variety is promoted by the educational system, the government, the media, etc. Codification: the “proper” variety is promoted through grammar books and dictionaries. This variety carries prestige.

→ Non-standard varieties are treated as incorrect or poor; they are stigmatized.

The factors of prestige and stigmatization depend on the speakers’ attitudes towards the varieties. People’s attitude to their own language often affects the form of that language.

→ Dialect: the characteristic patterns of words and word orders of a regional variety of a language. (Lexico-grammar)

→ Accent: pattern of pronunciation of a language variety.

→ Speech community: the way people speak often serves to define them as a group. It can also include non-linguistic factors, such as age, nationality, gender, town, etc. (social values, identity).

→ Language variation, linguistic variable: any single feature of language which could be realized by different choices (grammatical, lexical, phonological, discoursal). Related phenomena of linguistic variation: code-switching, diglossia, pidgins, creoles, etc.

This language variation is correlated with social factors. (Linguistic variables cannot be isolated from the social context in which they may occur).

→ Social factors: geographical and social mobility, gender and power, age, audience, identity, social network relations.



Text 5:

Psycholinguistics: Kees de Bot & Judith Kroll (2010)

Psycholinguistics is the study of the cognitive processes that support the acquisition and use of language. This scope includes language performance under normal circumstances and when it breaks down (brain damage).

→ Focus in L1: language acquisition in children, research on adult comprehension and production.

→ Focus in L2: bilinguals (research on the language processing of individuals who are actively using more than one language; the term as used in the text does not distinguish levels of proficiency).

Bilinguals outnumber monolinguals in the world’s population. Studying bilinguals provides a genuinely universal account of the cognitive mechanisms that underlie language performance.

Three basic assumptions:

L2 learners and proficient bilinguals rely on similar cognitive mechanisms.

These mechanisms are generally universal across languages.

The same cognitive resources are universally available to all learners.

Cognitive models constructed by psycholinguists to characterize the representations and processes that underlie language:

By predicting specific language behavior, psycholinguists aim to capture all aspects of language use. Goal: have a model that describes how language is processed in our brains.

Levelt (model designed mostly for monolingual speakers): his “speaking model” is intended to describe the process of language production, from the development of communicative intentions to the articulation of the sounds, through a series of components:

1.Conceptualizer: (our thinking, not linguistic in nature) our communicative intentions are turned into something that can be expressed in human language. Utterances are planned on the basis of the meanings to be expressed.

2.Formulator: isolated words and meanings are turned into sentences that are translated accordingly into sounds by a third component:



3.The articulator

In terms of lexis, this model can be explained as follows: we first select words (lexical items) on the basis of the meanings we want to express. Then the activation of these lexical items triggers syntactic procedures that lead to sentence formation.

Lexical items consist of two parts: lemma (meaning and syntax) and lexeme (morphological and phonological information).

Surface structure 

Three levels: the conceptual level, where all the information is stored; the lemma level, which holds the semantic information required to match the conceptual and syntactic information necessary to arrive at a surface structure (thus the lemma can be said to be the link between meaning and form); and the lexeme level, which holds morphological and phonological information.

However, the relation between lemma and lexeme does not always lead to retrieval of the specific lexeme, or the lexeme is not retrieved as a whole. The lexeme is not a ready-made template; rather, it consists of a phonological frame into which phonological segments are inserted.

(tip-of-the-tongue phenomena)

Poulisse mentions the following factors to be taken into account in a “bilingual speaker model”.

L2 knowledge is typically incomplete: L2 speakers have fewer words and rules at their disposal than L1 speakers.

L2 speech is more hesitant and contains more errors, depending on the learner’s level of proficiency: cognitive skill theories, such as Schneider’s or Anderson’s ACT, stress the importance of developing automatic processes, which are difficult to acquire and hard to unlearn. Less automaticity means that more attention has to be paid to the execution of specific lower-level tasks (such as pronunciation).

L2 speech often carries traces of the L1: L2 speakers have a fully developed L1 at their disposal, and may switch to their L1 either deliberately (motivated switches to express group membership) or unintentionally (‘performance’ switches, when an L1 word is accidentally accessed instead of an intended L2 word).

According to Poulisse, this code-switching is similar to substitutions and slips in monolingual speech. L2 speech also contains traces of the L1 due to transfer or cross-linguistic influence. Poulisse further argues that the lack of automaticity in L2 speakers, like the problem of L1 traces in L2 speech, can be handled within existing monolingual production models.



However, Paradis argues that neither switches to the L1 nor cross-linguistic influence phenomena call for adaptations of existing models. In terms of processing, cross-linguistic influence phenomena cannot be distinguished clearly from code-switching phenomena: both result from the working of the production system in an individual speaker.

Following this idea, in the field of psycholinguistics, code-switching and keeping languages apart are different aspects of the same phenomenon. Paradis has proposed the “sub-set hypothesis”: words (or syntactic rules or phonemes) from a given language form a sub-set of the total inventory. Each sub-set can be activated independently. The sub-sets are formed and maintained by the use of words in specific settings: a specific given language, or code-switching as a norm. This means that bilingual speakers have stores for lemmas, lexemes, syntactic rules, morpho-phonological rules and elements, and articulatory elements which are not different from those of monolingual speakers.

The sub-set theory may explain how the languages of bilinguals are kept apart, but not how the choice of a given language is made.


Language choice

Return to Levelt’s model: in speaking, the most crucial step is the matching of chunks from the pre-verbal message with the meaning part of lemmas, because here the transition from conceptualization to language-specific coding takes place.

Following the model, the lemma consists of three parts: a semantic specification (in lexical retrieval a single concept can temporarily activate more than one semantically related lemma, suggesting that the lemma store is organized according to semantic principles), syntactic information (when a lemma is activated, its particular syntactic environment is defined as well), and a pointer to a particular lexeme (the lexeme contains the phonological specifications of a lemma and its morphological make-up; this lemma-lexeme relation is not entirely clear).

This is a process where choices have to be made. Poulisse argues that “language” is one of the features used in the selection process. For example, the lemma ‘boy’ contains the semantic features of ‘male’ and ‘young’. But for a bilingual speaker who has English as one of his languages, the lemma ‘boy’ needs to contain information about which language it belongs to, and this information will also need to match the language cue in the pre-verbal message.

Language production in L1 and L2

The text describes investigations on the planning of utterances in real time. According to the text, even proficient bilinguals are faster to produce language in their L1 than in L2. One possibility is that bilingual speakers are slower to access the phonology of L2 than L1 and therefore they are slower on any production task in L2.

Another alternative is that bilingual speakers are not only slower to speak the L2, but that it is also harder to select the L2 for output. L2 lemmas may be more weakly activated than the corresponding L1 lemmas (and thus more vulnerable).

Green’s ‘inhibitory control model’: L1 lemmas are suppressed to allow bilinguals to speak words in L2.

But there is evidence for cross-language semantic interference. This suggests that lemmas in both languages are active during speaking.

Costa’s ‘language specific model’: bilinguals establish a mental firewall of sorts in which lexical alternatives in both languages become active but attention is directed only to the selection of candidates in the target language cued for selection.

On the other hand, Green’s ‘inhibitory control model’, is a competition-for-selection model, in which all activated candidates compete for selection, requiring eventual inhibition of alternatives in the unintended language.

The non-selective nature of lexical access

When the L1 was required to be active, there was significant interference for interlingual homographs relative to their control words. The result suggests that when the non-target language is sufficiently active, the alternative reading of the word is also available: both languages are active.

Even skilled bilinguals appear to be unable to control the consequences of activating information in the unintended language. One implication is that learners may be even more vulnerable to the consequences of the effects of L1 lexical form on processing the L2.

Developing lexical proficiency in a Second Language

Potter’s results suggest that L2 learners at earlier stages of L2 development show lexical mediation whereby L1 translation equivalents were activated to facilitate access to concepts.

The costs of code switching

It is defined as the use of more than one language in an utterance. Psycholinguistic approach: focuses on the mechanisms of language processing involved.

Switching costs (time and effort): according to Meuter and Allport, switching does cost time, and it takes more time to switch from the weaker language into the stronger language than the other way around. Their explanation was that more effort is needed to inhibit the stronger language, so it takes accordingly more time and effort to reactivate it.

CS in dialogue: Kootstra takes dialogue as the basic unit of analysis; Kootstra et al. argue that CS typically takes place in interaction and that the study of CS in dialogue is ecologically more valid than a monologue perspective. They gather data using the “confederate paradigm”, in which interactional data are gathered in a setting where one of the interactants deliberately manipulates the conversation to prime certain types of language use.

The cognitive consequences of bilingualism

Bilingualism appears to provide a measure of protection against the normal effects of cognitive aging. (The hypothesis is that a life spent negotiating cross-language competition fine tunes a set of cognitive skills that benefit the ability to select targeted information).

Implications

Two factors determine accessibility of linguistic elements, in particular in non-balanced bilinguals and language learners:

The information must have been acquired and stored, and it must be accessible in time.

Much of our linguistic knowledge is by definition unstable: words and rules are not always equally available, and availability depends on similarity to the L1 or other languages.

It is only through extensive contact with a word in a variety of contexts that it will gradually develop a full, close-to-native set of links.



In recent decades, the field of forensic linguistics has become a discipline of interest for scholars, lawyers, professionals who work in applied linguistics, and graduate students. In spite of being a relatively unexplored and, at the same time, multidisciplinary discipline, this growing practice has been widely understood as the interface between law and language. In fact, according to Gibbons and Turell (2014), forensic linguistics has many aspects. These aspects or areas of study can be divided into: the written language of the law, particularly the language of legislation; spoken legal discourse, especially the language of court proceedings and police questioning; the social justice issues that emerge from the written and spoken language of the law; the provision of linguistic evidence, which can be divided into evidence on identity/authorship and evidence on communication; the teaching and learning of spoken and written legal language; and legal translation and interpreting (Gibbons and Turell, 2014).

So, as can be perceived, forensic linguistics is a multidimensional discipline. However, this paper will focus on forensic linguistics as a tool that provides evidence especially in terms of identity and authorship, concepts that are related to plagiarism. It is important to mention that plagiarism is also a multidimensional area involving, for example, the contexts in which it is produced (such as copyright and translation) and the settings and activities in which it occurs (such as essays, novels, songs, etc.). Also, for the purposes of this research, authors’ rights are treated as part of copyright law when talking about intellectual property.



  • teachers can modify tasks to bring the best out of learners with preferred styles
  • categories of style preferences to understand the L2 learning process
    • sensory/perceptual
    • cognitive
    • personality-related
  • steps for instruction based on styles and strategies: help learners become more conscious and systematic when using their strategies and add new ones
    • raise awareness about learning style preferences
    • find out which styles the learners favour
    • modelling new strategies
    • provide a rationale for strategy use
    • give exercises for them to put strategies into practice
    • encourage learners to enhance their current strategies
    • encourage use of strategies even when it means taking risks
    • point out cultural differences in the use of strategies
    • organize sharing sessions
  • the teacher should keep learners’ motivational level high; for this he/she has to keep strategies in mind and check their use