Exploration server on opera

Please note: this site is under development!
Please note: this site is generated automatically from raw corpora.
The information it contains has therefore not been validated.

Unifying performer and accompaniment

Internal identifier: 000288 (PascalFrancis/Corpus); previous: 000287; next: 000289

Unifying performer and accompaniment

Authors: Lars Graugaard

Source:

RBID: Pascal:08-0036216

French descriptors

English descriptors

Abstract

A unique real-time system for correlating a vocal, musical performance to an electronic accompaniment is presented. The system has been implemented and tested extensively in performance in the author's opera 'La Quintrala', and experience with its use in practice is presented. Furthermore, the system's functionality is outlined, it is put into current research perspective, and its possibilities for further development and other usages are discussed. The system correlates voice analysis to an underlying chord structure, stored in computer memory. This chord structure defines the primary supportive pitches, and links the notated and electronic score together, addressing the singer's need for tonal indicators at any given moment. A computer-generated note is initiated by a combination of the singer - by the onset of a note, or by some element in the continuous spectrum of the singing - and the computer through an accompaniment algorithm. The evolution of this relationship between singer and computer is predefined in the application according to the structural intentions of the score, and is affected by the musical and expressive efforts of the singer. The combination of singer and computer influencing the execution of the accompaniment creates a dynamic, musical interplay between singer and computer, and is a very fertile musical area for a composer's combined computer programming and score writing.
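The accompaniment mechanism summarized above (voice analysis matched against a chord structure held in computer memory) can be illustrated with a short sketch. The following Python fragment is an illustration only, not the author's implementation: the chord structure, timings and pitch values are hypothetical, and the sketch assumes that some external pitch-tracking front end delivers detected note onsets as MIDI pitch numbers.

# Minimal sketch: correlate detected vocal onsets with a chord structure stored in memory.
# The chord structure below is hypothetical illustrative data, not taken from 'La Quintrala'.

from bisect import bisect_right

# Chord structure: (start time in seconds, supportive pitches as MIDI note numbers).
CHORD_STRUCTURE = [
    (0.0,  [57, 60, 64]),   # A minor
    (8.0,  [55, 59, 62]),   # G major
    (16.0, [53, 57, 60]),   # F major
]

def supportive_pitches(time_s):
    """Return the supportive pitches active at a given point in the score."""
    starts = [t for t, _ in CHORD_STRUCTURE]
    index = max(bisect_right(starts, time_s) - 1, 0)
    return CHORD_STRUCTURE[index][1]

def accompaniment_note(sung_pitch, time_s):
    """Pick the supportive pitch closest to the sung pitch in pitch-class terms."""
    def pc_distance(p):
        d = abs(sung_pitch - p) % 12
        return min(d, 12 - d)
    return min(supportive_pitches(time_s), key=pc_distance)

# Example: an onset detected at t = 9.2 s with sung pitch D4 (MIDI 62)
print(accompaniment_note(62, 9.2))   # -> 62, the D of the G major chord

In this toy version each detected onset is simply answered with the nearest chord tone; in the system described in the abstract, an accompaniment algorithm and the predefined structural intentions of the score also determine when and how the computer-generated notes are produced.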

Record in standard format (ISO 2709)

See the documentation on the Inist Standard format.

pA  
A01 01  1    @0 0302-9743
A05       @2 3902
A08 01  1  ENG  @1 Unifying performer and accompaniment
A09 01  1  ENG  @1 Computer music modeling and retrieval : Texte imprimé : Third international symposium, CMMR 2005, Pisa, Italy, September 26-28, 2005 : revised papers
A11 01  1    @1 GRAUGAARD (Lars)
A12 01  1    @1 KRONLAND-MARTINET (Richard) @9 ed.
A12 02  1    @1 VOINIER (Thierry) @9 ed.
A12 03  1    @1 YSTAD (Sølvi) @9 ed.
A14 01      @1 Aalborg University Esbjerg, Department of Software and Media Technology, Niels Bohrs Vej 6 @2 6700 Esbjerg @3 DNK @Z 1 aut.
A20       @1 169-184
A21       @1 2006
A23 01      @0 ENG
A26 01      @0 3-540-34027-0
A43 01      @1 INIST @2 16343 @5 354000153589310160
A44       @0 0000 @1 © 2008 INIST-CNRS. All rights reserved.
A45       @0 34 ref.
A47 01  1    @0 08-0036216
A60       @1 P @2 C
A61       @0 A
A64 01  1    @0 Lecture notes in computer science
A66 01      @0 DEU
C01 01    ENG  @0 A unique real time system for correlating a vocal, musical performance to an electronic accompaniment is presented. The system has been implemented and tested extensively in performance in the author's opera 'La Quintrala', and experience with its use in practice is presented. Furthermore, the system's functionality is outlined, it is put into current research perspective, and its possibilities for further development and other usages is discussed. The system correlates voice analysis to an underlying chord structure, stored in computer memory. This chord structure defines the primary supportive pitches, and links the notated and electronic score together, addressing the needs of the singer for tonal indicators' at any given moment. A computer-generated note is initiated by a combination of the singer - by the onset of a note, or by some element in the continuous spectrum of the singing - and the computer through an accompaniment algorithm. The evolution of this relationship between singer and computer is predefined in the application according to the structural intentions of the score, and is affected by the musical and expressive efforts of the singer. The combination of singer and computer influencing the execution of the accompaniment creates a dynamic, musical interplay between singer and computer, and is a very fertile musical area for a composer's combined computer programming and score writing.
C02 01  X    @0 001B40C75
C02 02  X    @0 001D02B04
C02 03  X    @0 001B40C38
C03 01  3  FRE  @0 Acoustique audio @5 01
C03 01  3  ENG  @0 Audio acoustics @5 01
C03 02  X  FRE  @0 Recherche information @5 02
C03 02  X  ENG  @0 Information retrieval @5 02
C03 02  X  SPA  @0 Búsqueda información @5 02
C03 03  X  FRE  @0 Musique @5 03
C03 03  X  ENG  @0 Music @5 03
C03 03  X  SPA  @0 Música @5 03
C03 04  X  FRE  @0 Système temps réel @5 06
C03 04  X  ENG  @0 Real time system @5 06
C03 04  X  SPA  @0 Sistema tiempo real @5 06
C03 05  X  FRE  @0 Système information @5 07
C03 05  X  ENG  @0 Information system @5 07
C03 05  X  SPA  @0 Sistema información @5 07
C03 06  X  FRE  @0 Audition @5 08
C03 06  X  ENG  @0 Hearing @5 08
C03 06  X  SPA  @0 Audición @5 08
C03 07  X  FRE  @0 Tonie @5 09
C03 07  X  ENG  @0 Pitch(acoustics) @5 09
C03 07  X  SPA  @0 Altura sonida @5 09
C03 08  X  FRE  @0 Chanteur @5 10
C03 08  X  ENG  @0 Singer @5 10
C03 08  X  SPA  @0 Cantor @5 10
C03 09  X  FRE  @0 Acoustique musicale @5 18
C03 09  X  ENG  @0 Musical acoustics @5 18
C03 09  X  SPA  @0 Acústica musical @5 18
C03 10  X  FRE  @0 Voix @5 19
C03 10  X  ENG  @0 Voice @5 19
C03 10  X  SPA  @0 Voz @5 19
C03 11  X  FRE  @0 Structure ordinateur @5 20
C03 11  X  ENG  @0 Computer structure @5 20
C03 11  X  SPA  @0 Estructura computadora @5 20
C03 12  X  FRE  @0 Intention @5 21
C03 12  X  ENG  @0 Intention @5 21
C03 12  X  SPA  @0 Intencíon @5 21
C03 13  X  FRE  @0 Etude expérimentale @5 33
C03 13  X  ENG  @0 Experimental study @5 33
C03 13  X  SPA  @0 Estudio experimental @5 33
N21       @1 052
N44 01      @1 OTO
N82       @1 OTO
pR  
A30 01  1  ENG  @1 International Symposium on Computer Music Modeling and Retrieval @2 3 @3 Pisa ITA @4 2005

Inist format (server)

NO : PASCAL 08-0036216 INIST
ET : Unifying performer and accompaniment
AU : GRAUGAARD (Lars); KRONLAND-MARTINET (Richard); VOINIER (Thierry); YSTAD (Sølvi)
AF : Aalborg University Esbjerg, Department of Software and Media Technology, Niels Bohrs Vej 6/6700 Esbjerg/Danemark (1 aut.)
DT : Publication en série; Congrès; Niveau analytique
SO : Lecture notes in computer science; ISSN 0302-9743; Allemagne; Da. 2006; Vol. 3902; Pp. 169-184; Bibl. 34 ref.
LA : Anglais
EA : A unique real time system for correlating a vocal, musical performance to an electronic accompaniment is presented. The system has been implemented and tested extensively in performance in the author's opera 'La Quintrala', and experience with its use in practice is presented. Furthermore, the system's functionality is outlined, it is put into current research perspective, and its possibilities for further development and other usages is discussed. The system correlates voice analysis to an underlying chord structure, stored in computer memory. This chord structure defines the primary supportive pitches, and links the notated and electronic score together, addressing the needs of the singer for tonal indicators' at any given moment. A computer-generated note is initiated by a combination of the singer - by the onset of a note, or by some element in the continuous spectrum of the singing - and the computer through an accompaniment algorithm. The evolution of this relationship between singer and computer is predefined in the application according to the structural intentions of the score, and is affected by the musical and expressive efforts of the singer. The combination of singer and computer influencing the execution of the accompaniment creates a dynamic, musical interplay between singer and computer, and is a very fertile musical area for a composer's combined computer programming and score writing.
CC : 001B40C75; 001D02B04; 001B40C38
FD : Acoustique audio; Recherche information; Musique; Système temps réel; Système information; Audition; Tonie; Chanteur; Acoustique musicale; Voix; Structure ordinateur; Intention; Etude expérimentale
ED : Audio acoustics; Information retrieval; Music; Real time system; Information system; Hearing; Pitch(acoustics); Singer; Musical acoustics; Voice; Computer structure; Intention; Experimental study
SD : Búsqueda información; Música; Sistema tiempo real; Sistema información; Audición; Altura sonida; Cantor; Acústica musical; Voz; Estructura computadora; Intencíon; Estudio experimental
LO : INIST-16343.354000153589310160
ID : 08-0036216

Links to Exploration step

Pascal:08-0036216

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">Unifying performer and accompaniment</title>
<author>
<name sortKey="Graugaard, Lars" sort="Graugaard, Lars" uniqKey="Graugaard L" first="Lars" last="Graugaard">Lars Graugaard</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Aalborg University Esbjerg, Department of Software and Media Technology, Niels Bohrs Vej 6</s1>
<s2>6700 Esbjerg</s2>
<s3>DNK</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">08-0036216</idno>
<date when="2006">2006</date>
<idno type="stanalyst">PASCAL 08-0036216 INIST</idno>
<idno type="RBID">Pascal:08-0036216</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000288</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">Unifying performer and accompaniment</title>
<author>
<name sortKey="Graugaard, Lars" sort="Graugaard, Lars" uniqKey="Graugaard L" first="Lars" last="Graugaard">Lars Graugaard</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Aalborg University Esbjerg, Department of Software and Media Technology, Niels Bohrs Vej 6</s1>
<s2>6700 Esbjerg</s2>
<s3>DNK</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</analytic>
<series>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
<imprint>
<date when="2006">2006</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Audio acoustics</term>
<term>Computer structure</term>
<term>Experimental study</term>
<term>Hearing</term>
<term>Information retrieval</term>
<term>Information system</term>
<term>Intention</term>
<term>Music</term>
<term>Musical acoustics</term>
<term>Pitch(acoustics)</term>
<term>Real time system</term>
<term>Singer</term>
<term>Voice</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Acoustique audio</term>
<term>Recherche information</term>
<term>Musique</term>
<term>Système temps réel</term>
<term>Système information</term>
<term>Audition</term>
<term>Tonie</term>
<term>Chanteur</term>
<term>Acoustique musicale</term>
<term>Voix</term>
<term>Structure ordinateur</term>
<term>Intention</term>
<term>Etude expérimentale</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">A unique real time system for correlating a vocal, musical performance to an electronic accompaniment is presented. The system has been implemented and tested extensively in performance in the author's opera 'La Quintrala', and experience with its use in practice is presented. Furthermore, the system's functionality is outlined, it is put into current research perspective, and its possibilities for further development and other usages is discussed. The system correlates voice analysis to an underlying chord structure, stored in computer memory. This chord structure defines the primary supportive pitches, and links the notated and electronic score together, addressing the needs of the singer for tonal indicators' at any given moment. A computer-generated note is initiated by a combination of the singer - by the onset of a note, or by some element in the continuous spectrum of the singing - and the computer through an accompaniment algorithm. The evolution of this relationship between singer and computer is predefined in the application according to the structural intentions of the score, and is affected by the musical and expressive efforts of the singer. The combination of singer and computer influencing the execution of the accompaniment creates a dynamic, musical interplay between singer and computer, and is a very fertile musical area for a composer's combined computer programming and score writing.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA01 i1="01" i2="1">
<s0>0302-9743</s0>
</fA01>
<fA05>
<s2>3902</s2>
</fA05>
<fA08 i1="01" i2="1" l="ENG">
<s1>Unifying performer and accompaniment</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG">
<s1>Computer music modeling and retrieval : Texte imprimé : Third international symposium, CMMR 2005, Pisa, Italy, September 26-28, 2005 : revised papers</s1>
</fA09>
<fA11 i1="01" i2="1">
<s1>GRAUGAARD (Lars)</s1>
</fA11>
<fA12 i1="01" i2="1">
<s1>KRONLAND-MARTINET (Richard)</s1>
<s9>ed.</s9>
</fA12>
<fA12 i1="02" i2="1">
<s1>VOINIER (Thierry)</s1>
<s9>ed.</s9>
</fA12>
<fA12 i1="03" i2="1">
<s1>YSTAD (Sølvi)</s1>
<s9>ed.</s9>
</fA12>
<fA14 i1="01">
<s1>Aalborg University Esbjerg, Department of Software and Media Technology, Niels Bohrs Vej 6</s1>
<s2>6700 Esbjerg</s2>
<s3>DNK</s3>
<sZ>1 aut.</sZ>
</fA14>
<fA20>
<s1>169-184</s1>
</fA20>
<fA21>
<s1>2006</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA26 i1="01">
<s0>3-540-34027-0</s0>
</fA26>
<fA43 i1="01">
<s1>INIST</s1>
<s2>16343</s2>
<s5>354000153589310160</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2008 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>34 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>08-0036216</s0>
</fA47>
<fA60>
<s1>P</s1>
<s2>C</s2>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA64 i1="01" i2="1">
<s0>Lecture notes in computer science</s0>
</fA64>
<fA66 i1="01">
<s0>DEU</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>A unique real time system for correlating a vocal, musical performance to an electronic accompaniment is presented. The system has been implemented and tested extensively in performance in the author's opera 'La Quintrala', and experience with its use in practice is presented. Furthermore, the system's functionality is outlined, it is put into current research perspective, and its possibilities for further development and other usages is discussed. The system correlates voice analysis to an underlying chord structure, stored in computer memory. This chord structure defines the primary supportive pitches, and links the notated and electronic score together, addressing the needs of the singer for tonal indicators' at any given moment. A computer-generated note is initiated by a combination of the singer - by the onset of a note, or by some element in the continuous spectrum of the singing - and the computer through an accompaniment algorithm. The evolution of this relationship between singer and computer is predefined in the application according to the structural intentions of the score, and is affected by the musical and expressive efforts of the singer. The combination of singer and computer influencing the execution of the accompaniment creates a dynamic, musical interplay between singer and computer, and is a very fertile musical area for a composer's combined computer programming and score writing.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>001B40C75</s0>
</fC02>
<fC02 i1="02" i2="X">
<s0>001D02B04</s0>
</fC02>
<fC02 i1="03" i2="X">
<s0>001B40C38</s0>
</fC02>
<fC03 i1="01" i2="3" l="FRE">
<s0>Acoustique audio</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="3" l="ENG">
<s0>Audio acoustics</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Recherche information</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>Information retrieval</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Búsqueda información</s0>
<s5>02</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Musique</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Music</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Música</s0>
<s5>03</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Système temps réel</s0>
<s5>06</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Real time system</s0>
<s5>06</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Sistema tiempo real</s0>
<s5>06</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE">
<s0>Système information</s0>
<s5>07</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG">
<s0>Information system</s0>
<s5>07</s5>
</fC03>
<fC03 i1="05" i2="X" l="SPA">
<s0>Sistema información</s0>
<s5>07</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE">
<s0>Audition</s0>
<s5>08</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG">
<s0>Hearing</s0>
<s5>08</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA">
<s0>Audición</s0>
<s5>08</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE">
<s0>Tonie</s0>
<s5>09</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG">
<s0>Pitch(acoustics)</s0>
<s5>09</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA">
<s0>Altura sonida</s0>
<s5>09</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE">
<s0>Chanteur</s0>
<s5>10</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG">
<s0>Singer</s0>
<s5>10</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA">
<s0>Cantor</s0>
<s5>10</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE">
<s0>Acoustique musicale</s0>
<s5>18</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG">
<s0>Musical acoustics</s0>
<s5>18</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA">
<s0>Acústica musical</s0>
<s5>18</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE">
<s0>Voix</s0>
<s5>19</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG">
<s0>Voice</s0>
<s5>19</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA">
<s0>Voz</s0>
<s5>19</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE">
<s0>Structure ordinateur</s0>
<s5>20</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG">
<s0>Computer structure</s0>
<s5>20</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA">
<s0>Estructura computadora</s0>
<s5>20</s5>
</fC03>
<fC03 i1="12" i2="X" l="FRE">
<s0>Intention</s0>
<s5>21</s5>
</fC03>
<fC03 i1="12" i2="X" l="ENG">
<s0>Intention</s0>
<s5>21</s5>
</fC03>
<fC03 i1="12" i2="X" l="SPA">
<s0>Intencíon</s0>
<s5>21</s5>
</fC03>
<fC03 i1="13" i2="X" l="FRE">
<s0>Etude expérimentale</s0>
<s5>33</s5>
</fC03>
<fC03 i1="13" i2="X" l="ENG">
<s0>Experimental study</s0>
<s5>33</s5>
</fC03>
<fC03 i1="13" i2="X" l="SPA">
<s0>Estudio experimental</s0>
<s5>33</s5>
</fC03>
<fN21>
<s1>052</s1>
</fN21>
<fN44 i1="01">
<s1>OTO</s1>
</fN44>
<fN82>
<s1>OTO</s1>
</fN82>
</pA>
<pR>
<fA30 i1="01" i2="1" l="ENG">
<s1>International Symposium on Computer Music Modeling and Retrieval</s1>
<s2>3</s2>
<s3>Pisa ITA</s3>
<s4>2005</s4>
</fA30>
</pR>
</standard>
<server>
<NO>PASCAL 08-0036216 INIST</NO>
<ET>Unifying performer and accompaniment</ET>
<AU>GRAUGAARD (Lars); KRONLAND-MARTINET (Richard); VOINIER (Thierry); YSTAD (Sølvi)</AU>
<AF>Aalborg University Esbjerg, Department of Software and Media Technology, Niels Bohrs Vej 6/6700 Esbjerg/Danemark (1 aut.)</AF>
<DT>Publication en série; Congrès; Niveau analytique</DT>
<SO>Lecture notes in computer science; ISSN 0302-9743; Allemagne; Da. 2006; Vol. 3902; Pp. 169-184; Bibl. 34 ref.</SO>
<LA>Anglais</LA>
<EA>A unique real time system for correlating a vocal, musical performance to an electronic accompaniment is presented. The system has been implemented and tested extensively in performance in the author's opera 'La Quintrala', and experience with its use in practice is presented. Furthermore, the system's functionality is outlined, it is put into current research perspective, and its possibilities for further development and other usages is discussed. The system correlates voice analysis to an underlying chord structure, stored in computer memory. This chord structure defines the primary supportive pitches, and links the notated and electronic score together, addressing the needs of the singer for tonal indicators' at any given moment. A computer-generated note is initiated by a combination of the singer - by the onset of a note, or by some element in the continuous spectrum of the singing - and the computer through an accompaniment algorithm. The evolution of this relationship between singer and computer is predefined in the application according to the structural intentions of the score, and is affected by the musical and expressive efforts of the singer. The combination of singer and computer influencing the execution of the accompaniment creates a dynamic, musical interplay between singer and computer, and is a very fertile musical area for a composer's combined computer programming and score writing.</EA>
<CC>001B40C75; 001D02B04; 001B40C38</CC>
<FD>Acoustique audio; Recherche information; Musique; Système temps réel; Système information; Audition; Tonie; Chanteur; Acoustique musicale; Voix; Structure ordinateur; Intention; Etude expérimentale</FD>
<ED>Audio acoustics; Information retrieval; Music; Real time system; Information system; Hearing; Pitch(acoustics); Singer; Musical acoustics; Voice; Computer structure; Intention; Experimental study</ED>
<SD>Búsqueda información; Música; Sistema tiempo real; Sistema información; Audición; Altura sonida; Cantor; Acústica musical; Voz; Estructura computadora; Intencíon; Estudio experimental</SD>
<LO>INIST-16343.354000153589310160</LO>
<ID>08-0036216</ID>
</server>
</inist>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Wicri/Musique/explor/OperaV1/Data/PascalFrancis/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000288 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Corpus/biblio.hfd -nk 000288 | SxmlIndent | more
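
To process the extracted record outside the Dilib toolchain, its fields can be read back with ordinary tools. The following is a minimal Python sketch under the assumption that the output of one of the HfdSelect commands above has been redirected to a file named record.xml (the file name is illustrative); it pulls the title, authors and English abstract out of the server block with plain regular expressions rather than a namespace-aware XML parser.

# Minimal sketch: read a few fields back from the exported record.
# Assumes the HfdSelect output above was redirected into record.xml (hypothetical name).

import re

with open("record.xml", encoding="utf-8") as f:
    text = f.read()

def field(tag):
    """Return the text content of a simple <TAG>...</TAG> element, or None."""
    match = re.search(rf"<{tag}>(.*?)</{tag}>", text, re.DOTALL)
    return match.group(1).strip() if match else None

print("Title:   ", field("ET"))
print("Authors: ", field("AU"))
abstract = field("EA") or ""
print("Abstract:", abstract[:80] + "...")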

To create a link to this page in the Wicri network

{{Explor lien
   |wiki=    Wicri/Musique
   |area=    OperaV1
   |flux=    PascalFrancis
   |étape=   Corpus
   |type=    RBID
   |clé=     Pascal:08-0036216
   |texte=   Unifying performer and accompaniment
}}

Wicri

This area was generated with Dilib version V0.6.21.
Data generation: Thu Apr 14 14:59:05 2016. Site generation: Thu Oct 8 06:48:41 2020