Hi,

I had to prepare a suitable file first.

Previous case: the open parameter "RU" (/UNDEFINED) ignores line terminators.

 
USER>open file:("RU":1000000)
 
USER>use file read x use 0 write $l(x)
164405
USER>write $e(x,*-30,*)
e></xs:complexType></xs:schema>

USER>close file

With the open parameter "RS" (/STREAM), line terminators are honored.

USER>open file:("RS":1000000) s l=0

USER>use file read x set l=l+$l(x) use 0 write $l(x)," ",l
4126 4126
USER>use file read x set l=l+$l(x) use 0 write $l(x)," ",l
18433 22559
USER>use file read x set l=l+$l(x) use 0 write $l(x)," ",l
61497 84056
USER>use file read x set l=l+$l(x) use 0 write $l(x)," ",l
80343 164399
USER>write $e(x,*-30,*)
e></xs:complexType></xs:schema>

USER>close file

The difference in length results from the skipped line terminators: 164405 - 164399 = 6 characters, i.e. the three CRLF pairs between the four records, which the "RU" read returns as part of the data.
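For completeness, the same line-by-line read as a small loop (just a sketch; the routine name and return text are mine, and the try/catch only absorbs the <ENDOFFILE> error that ends the read):

readLines(file) {   ; count records and payload characters of a file opened with "RS"
  set (total,lines)=0
  open file:("RS":1000000):2
  if '$test quit "cannot open "_file        ; open timed out
  try {
    for {
      use file read x                       ; one record per read, terminator stripped
      set lines=lines+1,total=total+$length(x)
    }
  } catch {
    ; <ENDOFFILE> lands here after the last record has been read
  }
  close file
  use 0
  quit lines_" records, "_total_" payload characters"
}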
 

There is no option such as 'NO_JSON' for strings,
so you have to do it by hand.

Assumption: according to your description, the basic table looks like this (except for Name):

select ID, Name, Options from Whatever.Whatever

ID   Name                    Options
1    Zucherro,Michelle Q.    {"Color":"Green","Count":4}
2    Paraskiv,Alexandra E.   {"Color":"Purple,""Count":6}
3    Ramsay,Jules T.         {"Color":"White""Count":8}
4    Grabscheid,Julie K.     {"Color":"Orange","Count":2}
5    Edwards,Mark S.         {"Color":"Red","Count":1}

Then this might do the trick:
you manually mask out the critical characters first and unmask them again after the JSON processing.
There is no help from the system here; you are on your own.

SELECT top 5 ID, REPLACE(REPLACE(REPLACE(
JSON_OBJECT('ID':ID,'Name': Name,'Options': $TRANSLATE(Options,'{}"','()^') )
,'"(','{')
,')"','}')
,'^','"')

FROM Whatever.Whatever

 
1  {"ID":1,"Name":"Zucherro,Michelle Q.","Options":{"Color":"Black","Count":7}}
2  {"ID":2,"Name":"Paraskiv,Alexandra E.","Options":{"Color":"Red","Count":1}}
3  {"ID":3,"Name":"Ramsay,Jules T.","Options":{"Color":"Purple","Count":6}}
4  {"ID":4,"Name":"Grabscheid,Julie K.","Options":{"Color":"Green","Count":4}}
5  {"ID":5,"Name":"Edwards,Mark S.","Options":{"Color":"White,""Count":8}}

 

Not elegant, but it works.

You could also compose your 'personalized' JSON result in a ClassMethod and project it as an SQL procedure.
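A minimal sketch of that alternative (the utility class name Whatever.Util and the method name are made up; it assumes Options holds valid JSON text):

Class Whatever.Util
{

/// Compose the JSON for one row; callable from SQL as Whatever.Util_RowAsJSON(ID)
ClassMethod RowAsJSON(id As %Integer) As %String [ SqlProc ]
{
    set row=##class(Whatever.Whatever).%OpenId(id)
    if '$isobject(row) quit ""
    ; build the outer object with dynamic objects instead of string masking
    set obj={}
    set obj.ID=id
    set obj.Name=row.Name
    ; parse the stored JSON text so it nests as an object, not as a quoted string
    set obj.Options=##class(%DynamicObject).%FromJSON(row.Options)
    quit obj.%ToJSON()
}

}

Then SELECT Whatever.Util_RowAsJSON(ID) FROM Whatever.Whatever returns the finished JSON per row without any REPLACE gymnastics.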

I'm really surprised by this discussion, especially given the actual numbers. What is 6400 supposed to mean with just 15000 rows in total? Sorry, I disagree!
It's a matter of selectivity. If a single value selects 2% or more of your records, then a BITMAP index makes sense.
Even the extent bitmap, which only filters on whether a row exists or not, falls under this rule, though it isn't really property-based.
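For illustration only (class and property names are made up), this is the difference between a property-based bitmap index and the extent bitmap:

Class Demo.Order Extends %Persistent
{

Property Status As %String;

/// Bitmap index: pays off when a single Status value selects roughly 2% or more of the rows
Index StatusIdx On Status [ Type = bitmap ];

/// Extent bitmap: only records which row IDs exist at all, independent of any property
Index ExtentIdx [ Extent, Type = bitmap ];

}

Both variants require the default positive-integer row IDs.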

1) YES, you can!
And it will not affect stability or usability. That said, understanding COS is definitely an advantage in understanding what is going on. The same applies to all other DBs: understanding concepts and internals is always a benefit. Other DBs are just not as open to investigation and not as flexible in doing the "undoable".

2a) Importing a Caché DB into IRIS and running it works in at least 98% of cases.
For the remaining 2%, ISC engineers are very willing to assist you and solve the issue.

2b) Converting COS to anything else depends mostly on the COS code you have in hand. I know of no converter that will do it for you.
Since COS still allows coding styles that were in use 40 years ago, the range of styles is a very broad and unpredictable field until you actually look at the code.
#1 You depend on the quality of external documentation.
#2 You depend on inline documentation, comments, and remarks in the code. These can be excellent or simply nonexistent.
#3 You depend on how tricky the code is designed and written.
At that point, just knowing COS might not be enough, and even experts can get a headache from what I call "dirty coding".
#4 ISC also has experts who can read and understand old-style code and its side effects.
#5 You have this bright community to ask.
#6 You have excellent online training facilities to learn COS. I've done this with ~12 people over the last few years.
If they understand objects, SQL, Java (or generic OO programming concepts), it's a matter of a few weeks to get inside COS.
And they have to be willing to break out and see something new, with other limits, other possibilities, other horizons.

HTH 

The key issue in a DR scenario is network performance between the instances.

You will most likely run an async mirror, to allow a reasonable distance between the production and DR sites.
I wouldn't assume there is enough bandwidth for a sync mirror.

The other issue is the performance of the DR site. You need enough capacity to process all the synchronization within a reasonably short delay. This is often underestimated: production servers grow and leave their DR site behind.

Not specific to the cloud, but no less important: how can you verify that the content of your DR site is really identical to your production?
For a heavily transactional operation, this can be a really tricky exercise.
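One low-tech sketch of such a check (names are placeholders, and it is only meaningful while updates are quiesced or with an accepted fuzziness): compute a checksum per global on both instances and compare the two numbers.

globalCRC(gref) {   ; e.g. write $$globalCRC^chk("^MyData") on both sides and compare
  set crc=0,node=gref
  for {
    set node=$query(@node)
    quit:node=""
    ; fold the full global reference and its value into one running CRC-32
    set crc=$zcrc(crc_node_$get(@node),7)
  }
  quit crc
}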

And last but not least: if you don't train your team for a disaster situation and verify your instructions step by step at least once a year, all your investment could be wasted money.

Especially this last point is skipped quite often, as in most cases it means a lot of effort with no immediate ROI.
 

I disagree:

- text coloring and font sizes all help to express the importance of text.

- right-to-left text is only useful for Hebrew or Arabic writing: not my world.

- special characters help a lot if you don't have them on your keyboard. ¿Isn't it? My 2 ¢

- what's wrong with smileys? (crying)

I do spell checking with Grammarly; the embedded one only confused me.

The Source view could use an improvement to wrap the text in the window; the single-line display is cumbersome.

Yet another approach: use your original "solution" in PHP, following the idea of a micro-service.

Instead of mimicking what PHP might do, I use it directly for this purpose.
That way, more sophisticated functionality can be used without recoding.

I extended your test to include doubled double quotes:

USER>write  %dstr
ABC Company,"123 Main St, Ste 102","Anytown, DC",10001,234-567-8901,"hi ""rcc"" was here"

USER>set reply=$$^phpCSV(%dstr) write reply,!! zwrite reply
ABC Company       123 Main St, Ste 102    Anytown, DC 10001   234-567-8901    hi "
rcc" was here
 
reply="ABC Company"_$c(9)_"123 Main St, Ste 102"_$c(9)_"Anytown, DC"_$c(9)_"10001"_$c(9)_"234-567-8901"_$c(9)_"hi ""rcc"" was here"


And here is the code:

phpCSV(str) {       ; use PHP to convert one CSV record into TAB-separated fields
#define php "............\php.exe "    ; add location of php.exe
#define pipe "|CPIPE|1"
  ; write a small PHP script that splits the CSV string and joins the pieces with \t
  set file="myTest.php"
  open file:"WN" use file
  write "<?php ",!
  write "$str='"_str_"';",!
  write "$strtotab = implode('\t', str_getcsv($str, ','));",!
  write "print_r($strtotab);",!
  write "?>",!
  close file
  ; run the script through a command pipe and collect its single line of output
  open $$$pipe:$$$php_file
  use $$$pipe read result
  close $$$pipe
  use 0  ; write result
  ; PHP wrote a literal \t between the fields; map it to a real TAB character
  quit $replace(result,"\t",$c(9))
}