When I restore a .qar file that has been generated by our build in the cloud, I can't use Locate Design File from TimeQuest.
The problem is that the .qar file contains absolute paths, which exist in the cloud, but not on my local computer.
How can we generate a .qar file with relative paths?
This is the command line we use to generate the .qar file. We can't use the GUI, as this is automated:
quartus_sh --archive -use_file_set full_db -common_dir /workspace/hardware/foo_top -output foo.qar top
We're using Quartus 19.3.
Normally, when you archive a project that includes files outside the project directory, the archiver copies the external files into the .qar and gives them a virtual drive letter to make the .qar portable. I don't know if the -common_dir option (which I haven't seen before) is affecting this. Have you tried creating the .qar without this option?
What you can do is open the Quartus GUI on any computer, then go to Project -> Archive Project, try to include the files with relative paths, and archive it.
After that, you should see the command that Quartus itself used, and you can copy and make use of it.
How, exactly, do I "try to include the files the relative path"?
I don't see an option to do this...
All the files that I can see in the "add" requester have relative paths.
When I add a custom file, I get a command line which uses "arc_file_list_temp.tmp.txt", which presumably lists all files.
It has absolute paths for the files that I explicitly selected and relative paths for the files that Quartus knows to include in the archive.
Info: Command: quartus_sh --archive -revision top -output /home/oyvind/Desktop/ascenium_restored/top_19_3_0_222.qar -use_file_set custom -input /home/oyvind/Desktop/ascenium_restored/arc_file_list_temp.tmp.txt -use_file_subset qsf -use_file_subset auto top
Here is the list of options available for the project archive:
• -all_revisions - Includes all revisions of the current project in the archive.
• -auto_common_directory - Preserves original project directory structure in archive
• -common_directory / - Preserves original project directory structure in specified subdirectory
• -include_libraries - Includes libraries in archive
• -include_outputs - Includes output files in archive
• -use_file_set - Includes specified fileset in archive
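Given that list, one hedged thing to try is replacing the absolute `-common_dir /workspace/hardware/foo_top` in the original command with `-auto_common_directory`, on the assumption that it preserves the project's directory structure without baking in cloud-side absolute paths. This is a sketch, unverified against Quartus 19.3:

```shell
# Hypothetical variant of the original archive command: swap the absolute
# -common_dir path for -auto_common_directory (from the option list above).
# Assumption: -auto_common_directory keeps paths in the .qar relative.
ARCHIVE_ARGS="--archive -use_file_set full_db -auto_common_directory -output foo.qar top"

if command -v quartus_sh >/dev/null 2>&1; then
    # Run the archiver if Quartus is on PATH (e.g. inside the build container).
    quartus_sh $ARCHIVE_ARGS
else
    echo "quartus_sh not on PATH; would run: quartus_sh $ARCHIVE_ARGS"
fi
```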
If it still doesn't work, let me know. We will have to enhance this feature.
We're using GitHub pull requests to drive Google Cloud Build.
It's a fair amount to set up:
- Create pull request
- All unit-tests are run
- P&R is run (using Quartus in docker image)
- Timing checks (our own script, as P&R will not return a non-zero exit code even if there's no timing closure)
- (Not in the cloud, but on a local Jenkins slave) The bitfile is uploaded to the FPGA
- (Not in the cloud) Various tests are run on the FPGA
- The pull request is marked as correct
- Mergify on GitHub merges the pull request
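The timing-check step above can be sketched as a small shell gate. Everything here is an assumption on my part: the report name (`top.sta.rpt`) and the way slack appears in it are guesses that would need adjusting to your revision and Quartus version; the only grounded fact is that P&R exits 0 even without timing closure, so the CI has to inspect the report itself.

```shell
# Hypothetical timing gate: fail the CI step if the Timing Analyzer report
# contains a negative slack value. The report path and its formatting are
# assumptions; a real script might instead query the Timing Analyzer via Tcl.
check_timing() {
    rpt="$1"
    # Look for any negative decimal number (e.g. "-0.123") preceded by
    # whitespace or start-of-line, which we assume indicates negative slack.
    if grep -Eq '(^|[[:space:]])-[0-9]+\.[0-9]+' "$rpt"; then
        echo "Timing closure FAILED: negative slack found in $rpt" >&2
        return 1
    fi
    echo "Timing OK: $rpt"
}

# Example (paths are placeholders):
#   check_timing output_files/top.sta.rpt || exit 1
```

Returning non-zero from this step is what lets the cloud build mark the pull request as failed even though `quartus_sta` itself exited cleanly.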
I've given it a try (I've got an account), but it seemed like a half-hearted attempt by Intel/Nimbix at checking the FPGA-cloud box.
It doesn't seem to start from an understanding of what problems FPGA developers need solved: there's not much documentation, no working examples of how to integrate with e.g. GitHub pull requests, no way to upload my own bitfiles to the Arria 10 FPGAs, etc.
Also, I would still need Google Cloud Build for its GitHub pull-request integration.
We only have documentation for Nimbix and Azure cloud.
I will ask the internal team whether we support Google Cloud and have documentation for it.
I have asked the developers to support Google Cloud in the near future. They will evaluate it, and if it is supported, we should have documentation and your problem should be resolved.