go to post
Not great around edge cases, but this passes the tests at least (and might beat a $find-based approach since it avoids set commands):
go to post
If the thing you're working with is actually some sort of configuration data then this is probably a good approach. If not then building your own integration to transmit the data is probably better.
go to post
Why? This sounds like HealthShare; maybe using the Configuration Registry would be the right fit. See: https://docs.intersystems.com/hs20221/csp/docbook/DocBook.UI.Page.cls?KE...
go to post
... ended up answering my own question in less time than it took to write it up. Solution (which might just be a workaround) is to force the content-type on the response to be application/octet-stream:
do inst.stream.SetAttribute("ContentDisposition","attachment; filename="""_inst.stream.GetAttribute("FileName")_"""")
do inst.stream.SetAttribute("ContentType","application/octet-stream")
set %response.Redirect = "%25CSP.StreamServer.cls?STREAMOID="_..Encrypt(inst.stream.GetStreamId())
go to post
@Eduard Lebedyuk it depends on the caller. In a CI process I could imagine doing different error handling for failed compilation vs. failed unit tests, this would be a way to signal those different modes of failure.
I've taken/seen approaches that are more shell-centric vs. more ObjectScript-centric which would be a driver for this being useful. With the package manager it's generally simpler to wrap things in <Invoke> or resource processors and then call IRIS with individual zpm commands (i.e., load then test) from CI. For some of my pre-package manager CI work we've had a big ObjectScript class that wraps all the build logic, running different sorts of tests, etc. In this case it would be useful to indicate the stage at which the failure occurred.
Regardless, $System.Process.Terminate is simpler to manage than "flag files" for sure, which would be the next best alternative. (IIRC in old days, maybe just pre-IRIS, there were Windows/Linux differences in $System.Process.Terminate's behavior, and that led us to use flag files.)
go to post
This is a typical implementation.
go to post
It's not exactly straightforward. See https://github.com/intersystems-community/zpm/blob/master/src/%25ZPM/Pac... for an example of how to work around it / get the answers you need.
go to post
@Michael Davidovich , try this:
Do $System.Process.Terminate($Job,<desired error code>)
See e.g. https://github.com/intersystems-community/zpm/blob/master/src/%25ZPM/Pac...
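On the CI side, those distinct exit codes can then drive different handling. A minimal Python sketch of the pattern; the exit-code values are a hypothetical convention, and a `python -c` stand-in is used so the snippet is self-contained where a real pipeline would invoke `iris session`:

```python
import subprocess
import sys

# Hypothetical exit-code convention for the build process
EXIT_OK = 0
EXIT_COMPILE_FAILED = 1
EXIT_TESTS_FAILED = 2

# Stand-in for an `iris session` call whose ObjectScript side exits via
# $System.Process.Terminate($Job, <code>); here we simulate failing unit tests.
result = subprocess.run([sys.executable, "-c", "import sys; sys.exit(2)"])

if result.returncode == EXIT_COMPILE_FAILED:
    print("compilation failed")
elif result.returncode == EXIT_TESTS_FAILED:
    print("unit tests failed")
elif result.returncode == EXIT_OK:
    print("build OK")
else:
    print("unknown failure:", result.returncode)
```

The point is only that the caller can branch on the stage of failure instead of parsing logs or checking flag files.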
go to post
select 1672185599,CONVERT(TIMESTAMP,CAST(1000000 * 1672185599 + POWER(2,60) As POSIXTIME))
Quite intuitive.
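The arithmetic in that query can be sanity-checked outside IRIS. A minimal Python sketch that mirrors only the packing shown above (the real %PosixTime encoding has more to it, e.g. sub-second precision and pre-1970 values, so treat this as illustrative):

```python
from datetime import datetime, timezone

OFFSET = 2 ** 60  # the POWER(2,60) bias from the query above

def to_posixtime(unix_seconds):
    # Pack whole Unix seconds the way the SQL expression does
    return 1_000_000 * unix_seconds + OFFSET

def from_posixtime(ptime):
    # Invert the packing back to whole Unix seconds
    return (ptime - OFFSET) // 1_000_000

pt = to_posixtime(1672185599)
print(datetime.fromtimestamp(from_posixtime(pt), tz=timezone.utc))
# 2022-12-27 23:59:59+00:00
```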
go to post
On further investigation, it seems that what I need just isn't in the old version I'm running.
go to post
Yes - see Security.Users (class reference) in the %SYS namespace.
go to post
Thank you Dan! The index/constraint distinction and the SQL standard context are particularly helpful for this discussion. :)
go to post
There would still need to be some enforcement of the super parent being the only node with a NULL parent (the point here is that the unique index wouldn't enforce that). Also, finding all of the top-level nodes (assuming we could have multiple independent trees) would be slightly more complicated.
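The two points above can be sketched with a toy id-to-parent map in Python (purely illustrative data; None stands in for a NULL parent):

```python
# id -> parent_id; two nodes have a NULL parent here, so this "tree"
# is really a forest with two independent roots.
nodes = {1: None, 2: 1, 3: 1, 4: 3, 5: None}

# Finding all top-level nodes means scanning for NULL parents:
roots = sorted(n for n, parent in nodes.items() if parent is None)
print(roots)  # [1, 5]

# A unique index typically permits many NULLs, so enforcing a single
# super parent needs an explicit check like this:
has_single_root = len(roots) == 1
print(has_single_root)  # False
```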
go to post
Thank you for pointing this out! I saw this in the docs, but I believe it wouldn't work for object-valued properties.
go to post
Hi @Steve Pisani - the same issue was reported via GitHub issues a little while back (https://github.com/intersystems/git-source-control/issues/137) but discussion trailed off and there wasn't any information there on the resolution.
You should always be able to upgrade zpm. I think the <CLASS DOES NOT EXIST> error could be resolved by running:
do ##class(%Studio.SourceControl.Interface).SourceControlClassSet("")
then reinstalling, then reenabling SourceControl.Git.Extension as the source control class for the namespace.
Ultimately something funky is going on with SQL stats collection. Given a bit more info it might be possible to mitigate the issue in the git-source-control package. Happy to connect sometime to discuss/troubleshoot.
From a diagnostic perspective, I think the things to do (which we would do on such a call) would be:
* Force single-process compilation: Do $System.OBJ.SetQualifiers("/nomulticompile")
* Run a fancy zbreak command:
zbreak *%objlasterror:"N":"$d(%objlasterror)#2":"s ^mtempdbg($i(^mtempdbg))=%objlasterror"
* Force single-process load of the package (zpm "install git-source-control -DThreads=0")
Then look at the contents of ^mtempdbg to figure out where our errant %Status is coming from and go from there.
go to post
Since these third-party apps are mostly reporting tools, it could make sense to set up a Reporting Async mirror with read-only databases. That would handle the "clients shouldn't be able to insert/update/delete" issue and protect your main instance from rogue queries. (And these clients would only be allowed to connect to the reporting async.)
go to post
For the record, actually having a page with the above allows anyone who can reach that page to read arbitrary files on the server (see my comment below from May 18).
You need to either do very strict input validation on filepath, or (if the full path is known in the database) use %CSP.StreamServer properly with an encrypted path.
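As an illustration of the strict-validation option (sketched in Python rather than ObjectScript, with a hypothetical ALLOWED_ROOT directory), the usual pattern is to resolve the candidate path and require it to stay inside an allow-listed root:

```python
from pathlib import Path

ALLOWED_ROOT = Path("/data/exports")  # hypothetical allow-listed directory

def safe_resolve(user_supplied: str) -> Path:
    # Resolve ".." segments and symlinks, then require the result
    # to still live under the allowed root.
    candidate = (ALLOWED_ROOT / user_supplied).resolve()
    if not candidate.is_relative_to(ALLOWED_ROOT.resolve()):
        raise ValueError("path escapes allowed directory")
    return candidate

print(safe_resolve("report.csv"))
# safe_resolve("../../etc/passwd") would raise ValueError
```

(`Path.is_relative_to` needs Python 3.9+; the same check can be done by comparing resolved path prefixes on older versions.)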
go to post
GitHub repo is here - https://github.com/SeanConnelly/CloudStudio
go to post
This is WICKED COOL.
go to post
@Jonathan Wald I'd recommend this approach, but using %XML.Exchange (in place of XML.Element - that's from TrakCare) and first having your persistent classes all extend %XML.Exchange.Adaptor.