go to post Eduard Lebedyuk · Dec 13, 2019 There are already prebuilt Docker containers; just pull one from Docker Hub:

docker pull intersystemscommunity/irispy:latest

The binaries are available from the releases page.
go to post Eduard Lebedyuk · Dec 12, 2019 I completely agree, and to get to a standard installation mechanism for USERS, we need to zpm-enable as many existing projects as possible. To enable these projects, we need to simplify zpm-enabling, leveraging existing code where possible (or at least not preventing developers from leveraging it). I think allowing developers to use already existing installers (whatever form they take) would help with this goal.
go to post Eduard Lebedyuk · Dec 12, 2019 Great article! I have some questions: Are there any Interoperability metrics? How do I add my own custom metrics?
go to post Eduard Lebedyuk · Dec 12, 2019 I completely support the inclusion of projections. The ObjectScript language allows execution of arbitrary code at compile time through three different mechanisms:

1. Projections
2. Code generators
3. Macros

All these instruments are entirely unlimited in their scope, so I don't see why we should prohibit one way of executing code at compilation. Furthermore, ZPM itself uses projections to install itself, so closing this avenue to other projects seems strange.
go to post Eduard Lebedyuk · Dec 12, 2019 Great to hear that! If you're interested in performance, upgrading to 2016.2+ would help tremendously with JSON processing due to the addition of dynamic objects. Furthermore, upgrading to IRIS 2019.1.1 would add %JSON.Adaptor which simplifies JSON (de)serialization of normal objects.
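As a quick hedged sketch of what %JSON.Adaptor buys you on IRIS 2019.1.1+ (the class and property names here are made up for illustration):

```objectscript
Class demo.Person Extends (%RegisteredObject, %JSON.Adaptor)
{

Property Name As %String;

}
```

Usage, assuming the class above:

```objectscript
set p = ##class(demo.Person).%New()
set p.Name = "Alice"
do p.%JSONExport()              // writes {"Name":"Alice"} to the current device
set sc = p.%JSONImport({"Name":"Bob"})  // deserialize from a dynamic object
```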
go to post Eduard Lebedyuk · Dec 12, 2019 Here's how:

1. Create a hardlink to the original database in a new empty folder.
2. Add this new database.
3. Mount the new database as read-only.
4. Map the package from this new database.

This way you can access the code for RW (in the original namespace/database) and RO (in the hardlink database). Just tried this setup and it worked.
go to post Eduard Lebedyuk · Dec 11, 2019 I have found a workaround.

1. Create the task class as usual.
2. Create this subclass extending (1):

/// Schedule it daily, but it would actually run only on Dates
Class util.CustomDatesTask Extends util.BaseTask
{

Parameter TaskName = "BaseTask (random dates)";

/// Comma-separated dates in YYYY-MM-DD format
/// Example: 2019-12-11,2020-01-17,2020-02-11,2020-03-10,2020-04-09,2020-05-12
Property Dates As %VarString;

/// Check that Dates is valid
Method %OnValidateObject() As %Status
{
    #dim sc As %Status = $$$OK
    set sc = ##super()
    quit:$$$ISERR(sc) sc

    try {
        set dates = $lfs(..Dates)
    } catch ex {
        set sc = ex.AsStatus()
        set sc = $$$ADDSC($$$ERROR($$$GeneralError, "Incorrect Dates value: " _ ..Dates), sc)
    }
    quit:$$$ISERR(sc) sc

    for i=1:1:$ll(dates) {
        try {
            set date = $lg(dates, i)
            set temp = $zdh(date, 3)
        } catch ex {
            set sc = ex.AsStatus()
            set sc = $$$ADDSC($$$ERROR($$$GeneralError, "Incorrect Date value: " _ date), sc)
        }
        quit:$$$ISERR(sc)
    }
    quit sc
}

Method OnTask() As %Status
{
    #dim sc As %Status = $$$OK
    set dates = $lfs(..Dates)
    set curDate = $zd($h, 3)
    if $lf(dates, curDate) {
        // Execute the task
        set sc = ##super()
    }
    quit sc
}

}

The advantage is that the schedule is easy to set as a task configuration property. The drawback is that log entries are created every day, even on days when the task body does not run.
go to post Eduard Lebedyuk · Dec 11, 2019 For example:

set maxrows = 1000
set currentrow = 0
while (ind '= "") {
    set row = ^CacheTemp(repid, "MAIN", ind)
    if currentrow > maxrows {
        set currentrow = 0
        // swap files here
    }
    set currentrow = currentrow + 1
    use filemain
    write row,!
    ; Get next row index for the MAIN report
    set ind = $order(^CacheTemp(repid, "MAIN", ind))
}
close filemain

Are you by chance exporting SQL queries to CSV? If so, it can be done automatically:

do ##class(%SQL.Statement).%ExecDirect(,"select * from ...").%DisplayFormatted(100, filename)
go to post Eduard Lebedyuk · Dec 10, 2019 You need to specify the pFormat parameter; it defaults to aceloqtw, where:

1-9 : indent with this number of spaces (4 is the default with the 'i' format specifier)
a - output null arrays/objects
b - line break before opening { of objects
c - output the Caché-specific "_class" and "_id" properties
d - output Caché numeric properties that have value "" as null
e - output empty object properties
i - indent with 4 spaces unless 't' or 1-9
l - output empty lists
n - newline (lf)
o - output empty arrays/objects
q - output numeric values unquoted even when they come from a non-numeric property
s - use strict JSON output - NOTE: special care should be taken when sending data to a browser, as using this flag may expose you to cross site scripting (XSS) vulnerabilities if the data is sent inside <script> tags. Zen uses this technique extensively, so this flag should NOT be specified for jsonProviders in Zen pages.
t - indent with tab character
u - output pre-converted to UTF-8 instead of in native internal format
w - Windows-style cr/lf newline

(This list is quoted from the documentation.) In your case, explicitly remove c: aeloqtw.

Additionally, if you want to output JSON to the current device, it's better to use %WriteJSONFromObject - it has the same arguments except the stream, so there are no extra object and I/O redirection costs:

$$$TOE(tSC, ##class(%ZEN.Auxiliary.jsonProvider).%WriteJSONFromObject(,Store,,,,"aeloqtw"))
go to post Eduard Lebedyuk · Dec 9, 2019 It depends on coding style. I prefer not to use/propagate exceptions, but rather %Status.
go to post Eduard Lebedyuk · Dec 5, 2019 Check the Fileserver project for a demo of uploading and downloading streams/files.
go to post Eduard Lebedyuk · Dec 5, 2019 It's a custom pseudo-random day each month. How to reschedule a task on completion on a custom date?
go to post Eduard Lebedyuk · Dec 5, 2019 For the second one, check the documentation, "Nonexistent Table" section.
go to post Eduard Lebedyuk · Dec 4, 2019 Thanks, Marc. Found useful index: write ##class(Ens.Config.Item).NameExists(##class(Ens.Director).GetActiveProductionName(),"HostName")
go to post Eduard Lebedyuk · Dec 4, 2019 You need to index the columns used in conditions; for the specified query:

etatTitre
numRemise

If there are fewer than 6400 possible values, you can use bitmap indices. Start with individual indices (so the number of indices equals the number of condition columns).
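As an illustrative sketch (the class, property types, and index names here are hypothetical; only the two column names come from the query), the indices could be declared in the class definition like this:

```objectscript
Class demo.Titre Extends %Persistent
{

Property etatTitre As %String;

Property numRemise As %String;

/// Standard index on a condition column
Index EtatTitreIdx On etatTitre;

/// Bitmap index: appropriate only when the column has
/// fewer than 6400 distinct values
Index NumRemiseIdx On numRemise [ Type = bitmap ];

}
```

After adding indices to a class that already holds data, populate them with do ##class(demo.Titre).%BuildIndices().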
go to post Eduard Lebedyuk · Dec 3, 2019 Can/will moving to an async style of API help this? No. long strings aren't enabled in our instance(s) of Cache You should enable long strings. That's been the default for years. Anyway, use streams for response processing, they ignore string limits altogether.
go to post Eduard Lebedyuk · Dec 3, 2019 Check the Web Gateway timeouts, specifically the Server Response Timeout setting. The web server can also impose additional limitations. That said, I'd advise you to move to an async style of API; here's how. Currently you have one call, say /GetData, and it takes 10 minutes. Split it into two calls:

/StartTask - JOBs a task (GetData) and returns a GUID (the child pid in the simplest case)
/GetTask/:GUID - returns the current JOB status and, if the job is done, the data

Here's a sample ASYNC REST broker. This will save you a lot of problems down the line.
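The two-call split above can be sketched roughly like this (a minimal, hedged outline: the URLs, class name, and ^demo.task global are made up, and the linked sample broker is more complete):

```objectscript
Class demo.AsyncREST Extends %CSP.REST
{

XData UrlMap
{
<Routes>
<Route Url="/StartTask" Method="POST" Call="StartTask"/>
<Route Url="/GetTask/:guid" Method="GET" Call="GetTask"/>
</Routes>
}

/// JOB the long-running work and return an id immediately
ClassMethod StartTask() As %Status
{
    set guid = $system.Util.CreateGUID()
    job ##class(demo.AsyncREST).GetData(guid)
    write {"id": (guid)}.%ToJSON()
    quit $$$OK
}

/// Poll: report status, and the data once the job is done
ClassMethod GetTask(guid As %String) As %Status
{
    if $data(^demo.task(guid), data) {
        write {"status": "done", "data": (data)}.%ToJSON()
    } else {
        write {"status": "running"}.%ToJSON()
    }
    quit $$$OK
}

/// The long-running work itself (runs in a background job)
ClassMethod GetData(guid As %String)
{
    // ... the 10-minute computation goes here ...
    set ^demo.task(guid) = "result"
}

}
```

The client calls /StartTask once, then polls /GetTask/:GUID until the status flips to done, which keeps every individual HTTP request well under any gateway timeout.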