Files
chromebrew/lib/downloader.rb
Satadru Pramanik, DO, MPH, MEng 6b35c08b2e Enable Cached Building on GitHub Actions - webkit2gtk_4_1 → 2.50.1 (#13001)
* Refactor and update webkit2gtk_4_1
* Add arm patch.
* Adjust env options
* Add x86_64 build.
* Adjust build settings.
* Adjust arm build options.
* lint
* Adjust arm build.
* lint
* Adjust g++ in build.
* Add cache_build plumbing.
* Add NESTED_CI detection plumbing to see if we are running in a container on GitHub Actions.
* Adjust download options for cached builds.
* Adjust timed kill to kill cmake.
* Adjust triggering of cache_build.
* Cleanup output.
* Update cached build hash verification.
* Rubyize #{build_cachefile}.sha256 write.
* Adjust documentation of cache_build trigger.
* Also kill all ruby processes after finishing cache_build.
* Make cached build download info more useful.
* Add --regenerate-filelist option.
* Fix downloader.
* Try newer git commit.
* remove arm patch.
* Adjust hash checking for build downloads.
* Add message for checksum calculation since that can take a while.
* Add cached build restart code block.
* Add max build time to build workflow.
* fixup buildsystems
* Set workflow max build time to 5.5 hours.
* Indicate architectures for build in build workflow title.
* Adjust cached build uploading.
* Adjust workflow naming.
* Adjust installs after build.
* Adjust cached build logic.
* webkit => 2.50.1
* Adjust zstd options.
* Move CREW_CACHE_DIR to /tmp in GitHub Action containers.
* Adjust build cache location.
* revert crew const variable changes.
* Adjust CREW_KERNEL_VERSION for CI usage.
* Exclude @pkg.no_source_build? packages from cached builds.
* Always create CREW_CACHE_DIR.
* Clean up remnant @extract_dir folders from download command.
* Adjust permissions in workflow.
* Sync up workflows.
* lint
* Add x86_64 binaries
* Cleanup workflows.
* Do not use build cache if package binary exists.
* webkit: Package File Update Run on linux/amd64 container.
* webkit: Package File Update Run on linux/arm/v7 container.

---------

Signed-off-by: Satadru Pramanik <satadru@gmail.com>
Co-authored-by: satmandu <satmandu@users.noreply.github.com>
2025-10-14 09:06:32 +00:00

224 lines
8.0 KiB
Ruby

require 'io/console'
require 'open3'
require 'uri'
require_relative 'const'
require_relative 'color'
require_relative 'convenience_functions'
require_relative 'progress_bar'
require_relative 'require_gem'
require_gem('ptools')
begin
  require 'securerandom'
  # resolv-replace is no longer needed with ruby 3.4
  require 'resolv-replace' if Gem::Version.new(RUBY_VERSION) < Gem::Version.new('3.4.0')
  require 'net/http'
rescue RuntimeError => e
  # hide the error message and fall back to curl if securerandom raises an error
  if e.message == 'failed to get urandom'
    Object.send(:remove_const, :CREW_USE_CURL)
    CREW_USE_CURL = true
  else
    puts "Error is #{e}".lightred
    raise
  end
end

def downloader(url, sha256sum, filename = File.basename(url), no_update_hash: false, verbose: false)
  # downloader: wrapper for all Chromebrew downloaders (`net/http`, `curl`, ...)
  # Usage: downloader <url>, <sha256sum>, <filename::optional>, <verbose::optional>
  #
  # <url>: URL that points to the target file
  # <sha256sum>: SHA256 checksum, verify downloaded file with given checksum
  # <filename>: (Optional) Output path/filename
  # <verbose>: (Optional) Verbose output
  #
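  # As an illustrative sketch (hypothetical URL; 'SKIP' bypasses the checksum verification below):
  #   downloader('https://example.com/foo.tar.xz', 'SKIP', File.join(CREW_CACHE_DIR, 'foo.tar.xz'), verbose: true)
  #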
puts "downloader(#{url}, #{sha256sum}, #{filename}, #{verbose})" if verbose
uri = URI(url)
# Make sure the destination dir for the filename exists.
FileUtils.mkdir_p File.dirname(filename)
if CREW_USE_CURL || !ENV['CREW_DOWNLOADER'].to_s.empty?
# force using external downloader if either CREW_USE_CURL or ENV['CREW_DOWNLOADER'] is set
puts "external_downloader(#{uri}, #{filename}, #{verbose})" if verbose
external_downloader(uri, filename, verbose: verbose)
else
case uri.scheme
when 'http', 'https'
# use net/http if the url protocol is http(s)://
puts "http_downloader(#{uri}, #{filename}, #{verbose})" if verbose
http_downloader(uri, filename, verbose: verbose)
when 'file'
# use FileUtils to copy if it is a local file (the url protocol is file://)
if File.exist?(uri.path)
return FileUtils.cp(uri.path, filename)
else
abort "#{uri.path}: File not found :/".lightred
end
else
# use external downloader (curl by default) if the url protocol is not http(s):// or file://
puts "external_downloader(#{uri}, #{filename}, #{verbose})" if verbose
external_downloader(uri, filename, verbose: verbose)
end
end
# Verify with given checksum, using the external sha256sum binary so
# we do not load the entire file into ruby's process, which throws
# errors with large files on 32-bit architectures.
puts "Calculating checksum for #{filename}...".lightblue
sha256sum_out, _stderr, _status = Open3.capture3("sha256sum #{filename}")
calc_sha256sum = sha256sum_out.split[0]
unless (sha256sum =~ /^SKIP$/i) || (calc_sha256sum == sha256sum)
if CREW_FORCE && !no_update_hash
pkg_name = @pkg_name.blank? ? name : @pkg_name
puts "Updating checksum for #{filename}".lightblue
puts "from #{sha256sum} to #{calc_sha256sum}".lightblue
puts "in #{CREW_LOCAL_REPO_ROOT}/packages/#{pkg_name}.rb .".lightblue
system "sed 's/#{sha256sum}/#{calc_sha256sum}/g;w #{CREW_LOCAL_REPO_ROOT}/packages/#{pkg_name}.rb.new' #{CREW_LOCAL_REPO_ROOT}/packages/#{pkg_name}.rb && mv #{CREW_LOCAL_REPO_ROOT}/packages/#{pkg_name}.rb.new #{CREW_LOCAL_REPO_ROOT}/packages/#{pkg_name}.rb"
else
FileUtils.rm_f filename
warn 'Checksum mismatch :/ Try again?'.lightred, <<~EOT
Filename: #{filename.lightblue}
Expected checksum (SHA256): #{sha256sum.green}
Calculated checksum (SHA256): #{calc_sha256sum.red}
EOT
exit 2
end
end
rescue StandardError => e
warn e.full_message
# fallback to curl if error occurred
puts "external_downloader(#{uri}, #{filename}, #{verbose})" if verbose
external_downloader(uri, filename, verbose: verbose)
end
def get_http_response(uri)
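  # get_http_response: open a net/http connection and return the raw Net::HTTP
  # response for a GET of the given uri (String or URI); SSL errors trigger a
  # retry via the rescue below.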
  uri = URI(uri) if uri.is_a?(String)
  ssl_error_retry = 0

  # open http connection
  Net::HTTP.start(uri.host, uri.port, {
    max_retries: CREW_DOWNLOADER_RETRY,
    use_ssl: uri.scheme.eql?('https'),
    min_version: :TLS1_2,
    ca_file: SSL_CERT_FILE,
    ca_path: SSL_CERT_DIR
  }) do |http|
    return http.get(uri)
  end
rescue OpenSSL::SSL::SSLError
  # handle SSL errors
  ssl_error_retry += 1
  ssl_error_retry <= 3 ? retry : raise
end

def http_downloader(uri, filename = File.basename(uri.to_s), verbose: false)
  # http_downloader: Downloader based on net/http library
  ssl_error_retry = 0

  # open http connection
  Net::HTTP.start(uri.host, uri.port, {
    max_retries: CREW_DOWNLOADER_RETRY,
    use_ssl: uri.scheme.eql?('https'),
    min_version: :TLS1_2,
    ca_file: SSL_CERT_FILE,
    ca_path: SSL_CERT_DIR
  }) do |http|
    http.request(Net::HTTP::Get.new(uri)) do |response|
      case
      when response.is_a?(Net::HTTPSuccess)
        # Response is successful, don't abort
      when response.is_a?(Net::HTTPRedirection) # follow HTTP redirection
        puts <<~EOT if verbose
          * Follow HTTP redirection: #{response['Location']}
          *
        EOT
        redirect_uri = URI(response['Location'])
        # add url scheme/host for redirected url based on original url if missing
        redirect_uri.scheme ||= uri.scheme
        redirect_uri.host ||= uri.host
        return send(__method__, redirect_uri, filename, verbose: verbose)
      else
        abort "Download of #{uri} failed with error #{response.code}: #{response.msg}".lightred
      end

      # get target file size (should be returned by the server)
      file_size = response['Content-Length'].to_f
      downloaded_size = 0.0

      # initialize progress bar
      progress_bar = ChromebrewProgressBar.new(file_size)

      if verbose
        warn <<~EOT
          * Connected to #{uri.host} port #{uri.port}
          * HTTPS: #{uri.scheme.eql?('https')}
          *
        EOT

        # parse response's header to readable format
        response.to_hash.each_pair { |k, v| warn "> #{k}: #{v}" }
        warn "\n"
      end

      progress_bar_thread = progress_bar.show # print progress bar

      # read file chunks from server, write it to filesystem
      File.open(filename, 'wb') do |io|
        response.read_body do |chunk|
          downloaded_size += chunk.size # record downloaded size, used for showing progress bar
          progress_bar.set_downloaded_size(downloaded_size, invalid_size_error: false) if file_size.positive?
          io.write(chunk) # write to file
        end
      ensure
        # stop progress bar, wait for it to terminate
        progress_bar.progress_bar_showing = false
        progress_bar_thread.join
      end
    end
  end
rescue OpenSSL::SSL::SSLError
  # handle SSL errors
  ssl_error_retry += 1
  ssl_error_retry <= 3 ? retry : raise
end

def external_downloader(uri, filename = File.basename(uri.to_s), verbose: false)
  # external_downloader: wrapper for external downloaders set in CREW_DOWNLOADER (curl by default)
  # The default curl cmdline is below; CREW_DOWNLOADER should follow the same format:
  # %<verbose>s: will be substituted with "--verbose" if #{verbose} is set to true, otherwise with ""
  # %<retry>s: will be substituted with #{CREW_DOWNLOADER_RETRY}
  # %<url>s: will be substituted with #{uri}
  # %<output>s: will be substituted with #{filename}
  # i686 curl throws a "SSL certificate problem: self signed certificate in certificate chain" error.
  # Only bypass this when we are using the system curl early in install.
  @default_curl = File.which('curl')
  curl_cmdline = ARCH == 'i686' && @default_curl == '/usr/bin/curl' ? 'curl %<verbose>s -kL -# --retry %<retry>s %<url>s -o %<output>s' : 'curl %<verbose>s -L -# --retry %<retry>s %<url>s -o %<output>s'

  # use CREW_DOWNLOADER if specified, use curl by default
  downloader_cmdline = CREW_DOWNLOADER || curl_cmdline

  return system(
    format(downloader_cmdline,
           {
             verbose: verbose ? '--verbose' : '',
             retry: CREW_DOWNLOADER_RETRY,
             url: uri.to_s,
             output: filename
           }), exception: true
  )
end