Handling UTF-8 characters in Oracle external tables
I have an external table that reads a fixed-length file. The file is expected to contain special characters, for example the word "göteborg". Because "ö" is a special character, Oracle appears to count it as 2 bytes. This causes trouble: all subsequent fields in the record are shifted by 1 byte, thereby messing up the data. Has anyone faced this issue before? So far I have tried the following solutions:
- Changed the value of NLS_LANG to AMERICAN_AMERICA.WE8ISO8859P1
- Tried setting the database character set to UTF-8
- Tried changing NLS_LENGTH_SEMANTICS to CHAR instead of BYTE using ALTER SYSTEM
- Tried changing the external table character set to AL32UTF8
- Tried changing the external table character set to UTF-8
Nothing works. Other details:
- The file is UTF-8 encoded
- Operating system: RHEL
- Database: Oracle 11g

Is there anything else I might be missing? Any help is appreciated. Thanks!
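To illustrate the byte-vs-character mismatch described above, a small sketch (assuming an AL32UTF8 database, where Oracle's LENGTH counts characters and LENGTHB counts bytes):

    -- "göteborg" is 8 characters but 9 bytes in UTF-8,
    -- because "ö" encodes as 2 bytes. A fixed-length reader
    -- counting bytes therefore overruns the field by one.
    SELECT LENGTH('göteborg')  AS char_count,
           LENGTHB('göteborg') AS byte_count
    FROM   dual;

This one-byte difference per multibyte character is exactly what shifts the subsequent fields.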
NLS_LENGTH_SEMANTICS only pertains to the creation of new tables. The access parameters below did fix the problem:

    RECORDS DELIMITED BY NEWLINE
    CHARACTERSET AL32UTF8
    STRING SIZES ARE IN CHARACTERS

i.e.
    ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR
    /
    CREATE TABLE tdw_owner.sdp_tst_ext
    (
        cost_center_code VARCHAR2(10) NULL,
        cost_center_desc VARCHAR2(40) NULL,
        source_client    VARCHAR2(3)  NULL,
        name1            VARCHAR2(35) NULL
    )
    ORGANIZATION EXTERNAL
    (
        TYPE ORACLE_LOADER
        DEFAULT DIRECTORY dba_data_dir
        ACCESS PARAMETERS
        (
            RECORDS DELIMITED BY NEWLINE
            CHARACTERSET AL32UTF8
            STRING SIZES ARE IN CHARACTERS
            LOGFILE dba_data_dir:'sdp_tst_ext_%p.log'
            BADFILE dba_data_dir:'sdp_tst_ext_%p.bad'
            DISCARDFILE dba_data_dir:'sdp_tst_ext_%p.dsc'
            FIELDS NOTRIM
            (
                 cost_center_code CHAR(10)
                ,cost_center_desc CHAR(40)
                ,source_client    CHAR(3)
                ,name1            CHAR(35)
            )
        )
        LOCATION (dba_data_dir:'sdp_tst.dat')
    )
    REJECT LIMIT UNLIMITED
    NOPARALLEL
    NOROWDEPENDENCIES
    /
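A quick sanity check, as a sketch (it assumes the table above was created and the data file is in place): with STRING SIZES ARE IN CHARACTERS, each CHAR(n) field is read as n characters rather than n bytes, so a value like "göteborg" no longer shifts the fields that follow it.

    -- Any rejected rows would land in the BADFILE and be noted in the LOGFILE.
    SELECT cost_center_code, cost_center_desc, source_client, name1
    FROM   tdw_owner.sdp_tst_ext;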