This section documents the API.
api ¶
Modules:
| Name | Description |
|---|---|
citation | |
datetime | |
irradiance | |
performance | |
plot | |
position | |
power | |
quick_response_code | |
series | |
spectrum | |
statistics | |
surface | |
tmy | |
utilities | |
citation ¶
Functions:
| Name | Description |
|---|---|
convert_to_bibtex | |
convert_to_bibtex ¶
Source code in pvgisprototype/api/citation.py
def convert_to_bibtex(citation: Dict) -> str:
    """Convert a citation dictionary into a BibTeX entry string."""
bibtex = f"""
@misc{{pvgis,
title = {{{citation['title']}}},
subtitle = {{{citation['subtitle']}}},
version = {{{citation['version']}}},
author = {{{citation['contact']['name']}}},
howpublished = {{\\url{{{citation['links']['PVGIS Web Application']}}}}},
note = {{{citation['description']}}},
institution = {{{citation['contact']['name']}}},
address = {{{citation['contact']['address']['line1']}, {citation['contact']['address']['line2']}, {citation['contact']['address']['country']}}},
year = {2024}
}}"""
return bibtex
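A minimal usage sketch. The citation dictionary below is hypothetical; only the key layout mirrors what `convert_to_bibtex` reads, and a reduced version of the f-string assembly is restated inline so the snippet is self-contained:

```python
# Hypothetical citation dictionary; the key layout mirrors the fields
# that convert_to_bibtex() reads.
citation = {
    "title": "PVGIS",
    "subtitle": "A prototype",
    "version": "0.0.0",
    "description": "Example description",
    "contact": {
        "name": "Example Team",
        "address": {"line1": "Example Street 1", "line2": "Example City", "country": "Nowhere"},
    },
    "links": {"PVGIS Web Application": "https://example.org/pvgis"},
}

# Inline restatement of (part of) the f-string assembly from the function body;
# doubled braces emit the literal braces that BibTeX expects.
bibtex = f"""
@misc{{pvgis,
    title = {{{citation['title']}}},
    author = {{{citation['contact']['name']}}},
    howpublished = {{\\url{{{citation['links']['PVGIS Web Application']}}}}},
    year = {{2024}}
}}"""
print(bibtex)
```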
datetime ¶
Modules:
| Name | Description |
|---|---|
conversion | Timestamp relevant conversions |
datetimeindex | Date, time and zones |
helpers | |
now | |
random | Helper functions to generate random timestamps |
timezone | |
conversion ¶
Timestamp relevant conversions
Functions:
| Name | Description |
|---|---|
convert_timestamps_to_utc | |
convert_timestamps_to_utc ¶
convert_timestamps_to_utc(
user_requested_timezone: ZoneInfo | None = None,
user_requested_timestamps: (
Timestamp | DatetimeIndex | None
) = None,
) -> Timestamp | DatetimeIndex
Source code in pvgisprototype/api/datetime/conversion.py
def convert_timestamps_to_utc(
user_requested_timezone: ZoneInfo | None = None,
user_requested_timestamps: Timestamp | DatetimeIndex | None = None,
) -> Timestamp | DatetimeIndex:
""" """
if user_requested_timestamps is None:
user_requested_timestamps = Timestamp.now()
logger.debug(
f"Input time zone : {user_requested_timezone}",
alt=f"Input time zone : [code]{user_requested_timezone}[/code]",
)
utc_timestamps = user_requested_timestamps # Fallback if already UTC
# naive timestamps
if user_requested_timestamps.tz is None:
utc_timestamps = user_requested_timestamps.tz_localize(ZONEINFO_UTC)
logger.debug(
f"Naive input timestamps\n({user_requested_timestamps})\nlocalized to UTC aware for all internal calculations :\n{utc_timestamps}"
)
# timezone aware timestamps
elif user_requested_timestamps.tz != ZONEINFO_UTC:
utc_timestamps = user_requested_timestamps.tz_convert(ZONEINFO_UTC)
logger.debug(
f"Input zone\n{user_requested_timezone}\n& timestamps :\n{user_requested_timestamps}\n\nconverted for all internal calculations to :\n{utc_timestamps}",
alt=f"Input zone : [code]{user_requested_timezone}[/code]\n& timestamps :\n{user_requested_timestamps}\n\nconverted for all internal calculations to :\n{utc_timestamps}",
)
return utc_timestamps
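The localize-vs-convert branching above can be sketched with pandas directly (timestamp values are illustrative):

```python
from zoneinfo import ZoneInfo
from pandas import Timestamp

UTC = ZoneInfo("UTC")

def to_utc(timestamps):
    # Naive timestamps are localized to UTC ...
    if timestamps.tz is None:
        return timestamps.tz_localize(UTC)
    # ... while time zone aware ones are converted.
    return timestamps.tz_convert(UTC)

naive = Timestamp("2024-06-01 12:00")
aware = Timestamp("2024-06-01 12:00", tz="Europe/Brussels")
print(to_utc(naive))  # 2024-06-01 12:00:00+00:00
print(to_utc(aware))  # 2024-06-01 10:00:00+00:00
```

Localizing attaches a zone without shifting the wall-clock time; converting shifts the wall-clock time to express the same instant in UTC.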
datetimeindex ¶
Date, time and zones
By default, input timestamps are assumed to be in Coordinated Universal Time (UTC), unless a user explicitly requests another time zone or the system's local time zone. Regardless, all timestamps are converted internally to UTC. The rationale behind this design decision is:
- UTC provides an unambiguous reference point, as it does not observe Daylight Saving Time (DST), which can introduce various complexities.
- UTC is a standard used worldwide, making it a safer choice for interoperability.
- Using UTC avoids issues when a server's or system's local time zone is not under control.
- While the software allows users to specify their time zone if they wish, internally all timestamps are converted to UTC and only converted back to the user's time zone when displaying the time to the user.
Things to keep in mind, from https://blog.ganssle.io/articles/2022/04/naive-local-datetimes.html:
- The local offset may change during the course of the interpreter run.
- You can use datetime.astimezone with None to convert a naïve time into an aware datetime with a fixed offset representing the current system local time.
- All arithmetic operations should be applied to naïve datetimes when working in system local civil time — only call .astimezone(None) when you need to represent an absolute time, e.g. for display or comparison with aware datetimes.[3]
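The astimezone trick from the list above, in stdlib terms:

```python
from datetime import datetime

naive = datetime(2024, 6, 1, 12, 0)  # no tzinfo attached
# A naive datetime is presumed to represent system local time;
# astimezone() attaches the current local offset without changing
# the wall-clock time.
aware = naive.astimezone()
print(aware.tzinfo is not None)  # True
```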
Functions:
| Name | Description |
|---|---|
generate_datetime_series | Generate a fixed frequency DatetimeIndex |
generate_timestamps | Generate timestamps from an input data file or from user-requested time series parameters |
generate_datetime_series ¶
generate_datetime_series(
start_time: str | None = None,
end_time: str | None = None,
periods: int | None = None,
frequency: str | None = TIMESTAMPS_FREQUENCY_DEFAULT,
timezone: ZoneInfo | None = None,
name: str | None = None,
) -> Timestamp | DatetimeIndex
Generate a fixed frequency DatetimeIndex
Generates a range of equally spaced timestamps, wrapping Pandas' date_range() function. The timestamps satisfy start_time <[=] x <[=] end_time, where the first and last stamps fall on the boundary of the requested frequency string. The difference between any two consecutive timestamps is given by the requested frequency.
If exactly one of start_time, end_time, or frequency is not specified, it can be computed given periods, the number of timesteps in the range.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
start_time | str, datetime, date, pandas.Timestamp, or period-like | The starting time (if str in ISO format), also described as the left bound for generating periods. | None |
end_time | str, datetime, date, pandas.Timestamp, or period-like | The ending time in ISO format. In Pandas described as the right bound for generating periods. | None |
periods | int | Number of periods to generate. | None |
frequency | str or DateOffset | Frequency alias of the timestamps to generate, e.g., 'h' for hourly. By default the frequency is taken from start_time or end_time if those are Period objects. Otherwise, the default is "h" for hourly frequency. | TIMESTAMPS_FREQUENCY_DEFAULT |
name | str | Name of the resulting PeriodIndex. | None |
Returns:
| Type | Description |
|---|---|
DatetimeIndex | A Pandas DatetimeIndex at the specified frequency. |
See Also
pandas.date_range Return a fixed frequency DatetimeIndex.
Notes
Of the four parameters start_time, end_time, periods, and frequency, exactly three must be specified. If frequency is omitted, the resulting DatetimeIndex will have periods linearly spaced elements between start and end (closed on both sides).
Common time series frequencies are indexed via a set of string (also referred to as offset) aliases described at https://pandas.pydata.org/pandas-docs/stable/user_guide/timeseries.html#offset-aliases.
Example
>>> start_time = '2010-06-01 06:00:00'
>>> end_time = '2010-06-01 08:00:00'
>>> frequency = 'h'  # 'h' for hourly
>>> generate_datetime_series(start_time, end_time, frequency=frequency)
DatetimeIndex(['2010-06-01 06:00:00', '2010-06-01 07:00:00', '2010-06-01 08:00:00'], dtype='datetime64[ns]', freq=None)
Using the periods input parameter to define the number of timesteps to generate:
>>> generate_datetime_series(start_time=start_time, periods=4, frequency=frequency)
DatetimeIndex(['2010-06-01 06:00:00', '2010-06-01 07:00:00', '2010-06-01 08:00:00', '2010-06-01 09:00:00'], dtype='datetime64[ns]', freq='H')
Source code in pvgisprototype/api/datetime/datetimeindex.py
def generate_datetime_series(
start_time: str | None = None,
end_time: str | None = None,
periods: int | None = None,
frequency: str | None = TIMESTAMPS_FREQUENCY_DEFAULT,
timezone: ZoneInfo | None = None,
name: str | None = None,
) -> Timestamp | DatetimeIndex:
"""Generate a fixed frequency DatetimeIndex
Generates a range of equally spaced timestamps wrapping over Pandas'
date_range() function. The timestamps satisfy `start_time <[=] x <[=]
end_time`, where the first and last stamps fall on the boundary of the
requested ``frequency`` string. The difference between any two timestamps
is specified by the requested ``frequency``.
If exactly one of ``start_time``, ``end_time``, or ``frequency`` is *not*
specified, it can be computed by the ``periods``, the number of timesteps
in the range.
Parameters
----------
start_time : str, datetime, date, pandas.Timestamp, or period-like, default None
The starting time (if str in ISO format), also described as the left
bound for generating periods.
end_time : str, datetime, date, pandas.Timestamp, or period-like, default None
The ending time in ISO format. In Pandas described as the right bound
for generating periods.
periods : int, default None
Number of periods to generate.
frequency : str or DateOffset, optional
Frequency alias of the timestamps to generate, e.g., 'h' for hourly.
By default the frequency is taken from start_time or
end_time if those are Period objects. Otherwise, the default is "h" for
hourly frequency.
name : str, default None
Name of the resulting PeriodIndex.
Returns
-------
DatetimeIndex
A Pandas DatetimeIndex at the specified frequency.
See Also
--------
pandas.date_range
Return a fixed frequency DatetimeIndex.
Notes
-----
Of the four parameters ``start_time``, ``end_time``, ``periods``, and
``frequency``, exactly three must be specified. If ``frequency`` is
omitted, the resulting ``DatetimeIndex`` will have ``periods`` linearly
spaced elements between ``start`` and ``end`` (closed on both sides).
Common time series frequencies are indexed via a set of string (also
referred to as offset) aliases described at
https://pandas.pydata.org/pandas-docs/stable/user_guide/timeseries.html#offset-aliases
Example
-------
>>> start_time = '2010-06-01 06:00:00'
>>> end_time = '2010-06-01 08:00:00'
>>> frequency = 'h' # 'h' for hourly
>>> generate_datetime_series(start_time, end_time, frequency=frequency)
DatetimeIndex(['2010-06-01 06:00:00', '2010-06-01 07:00:00', '2010-06-01 08:00:00'], dtype='datetime64[ns]', freq=None)
Using the periods input parameter to define the number of timesteps to generate :
>>> generate_datetime_series(start_time=start_time, periods=4, frequency=frequency)
DatetimeIndex(['2010-06-01 06:00:00', '2010-06-01 07:00:00',
'2010-06-01 08:00:00', '2010-06-01 09:00:00'],
dtype='datetime64[ns]', freq='H')
"""
# Validate input parameters --
# Can we do this with a callback and at the Context level ?
number_of_parameters = sum(
parameter is not None for parameter in [start_time, end_time, periods]
)
if number_of_parameters < 2:
error_message = (
f"Insufficient parameters to generate timestamps. "
f"User input is : start_time={start_time}, end_time={end_time}, periods={periods}. "
f"Please provide at least two or at most three out of the timestamp relevant parameters!"
)
logger.error(error_message)
raise ValueError(error_message)
elif start_time is None and end_time is None:
timestamps = Timestamp.now(tz="UTC")
else:
try:
timestamps = date_range(
start=start_time,
end=end_time,
periods=periods,
freq=frequency,
tz=timezone,
name=name,
)
except Exception as e:
logger.exception("Failed to generate datetime series.")
raise ValueError(f"Failed to generate datetime series: {str(e)}")
if timestamps.empty:
error_message = "The generated DatetimeIndex is empty! You might want to check the relevant timestamp parameters for accuracy."
logger.error(error_message)
raise ValueError(error_message)
return timestamps
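Since the function wraps pandas.date_range(), the equivalent direct call is (parameter values are illustrative):

```python
from pandas import date_range

# Exactly three of start/end/periods/freq must be determined;
# here start + periods + freq fixes the range.
timestamps = date_range(
    start="2010-06-01 06:00:00",
    periods=3,
    freq="h",  # hourly; see the pandas offset aliases
    tz="UTC",
)
print(timestamps[0])   # 2010-06-01 06:00:00+00:00
print(len(timestamps)) # 3
```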
generate_timestamps ¶
generate_timestamps(
data_file: Path | None,
time_offset: Timedelta | None = None,
start_time: Timestamp | None = None,
end_time: Timestamp | None = None,
periods: int | None = None,
frequency: str | None = TIMESTAMPS_FREQUENCY_DEFAULT,
timezone: ZoneInfo | None = None,
name: str | None = None,
) -> DatetimeIndex
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
data_file | Path | None | | required |
time_offset | Timedelta | None | Time delta to add to all timestamps in the DatetimeIndex. This may be especially important for instantaneous data (i.e. SATAHx solar irradiance time series data) where the solar positioning has to coincide with the acquisition of the data. | None |
start_time | Timestamp | None | | None |
Source code in pvgisprototype/api/datetime/datetimeindex.py
def generate_timestamps(
data_file: Path | None,
time_offset: Timedelta | None = None,
start_time: Timestamp | None = None,
end_time: Timestamp | None = None,
periods: int | None = None,
frequency: str | None = TIMESTAMPS_FREQUENCY_DEFAULT,
timezone: ZoneInfo | None = None,
name: str | None = None,
) -> DatetimeIndex:
"""
Parameters
----------
data_file: Path
time_offset: Timedelta
Time delta to add to all timestamps in the DatetimeIndex. This may be
especially important for instantaneous data (i.e. SATAHx solar
irradiance time series data) where the solar positioning has to
coincide with the acquisition of the data.
start_time: Timestamp
"""
if start_time is not None and start_time == end_time:
raise ValueError(
"The `start_time` and `end_time` cannot be identical. If you intend to use a single timestamp, please specify it directly, e.g., '2121-12-22 12:21:12'."
)
# Extract timestamps from first available space-time data file
if data_file is not None:
logger.debug(
f"Retrieving timestamps from input time series data {data_file}",
alt=f"[bold]Retrieving[/bold] timestamps from input time series data [code]{data_file}[/code]",
)
if isinstance(data_file, Path | str):
timestamps = read_data_array_or_set(data_file).time # type: ignore
else:
timestamps = data_file.time # type: ignore
logger.info(
f"timestamps retrieved from {data_file} :\n{timestamps}",
alt=f"timestamps retrieved from [code]{data_file}[/code] :\n{timestamps}"
)
# Implement Me ? --------------------------------------------------- #
# #
# if check_for_duplicate_timestamps:
# if timestamps.indexes['time'].duplicated().any():
# logger.error(
# f"Duplicate timestamps detected.",
# alt= f"[red]Duplicate timestamps detected![/red]"
# )
# #
# ----------------------------------------------------- Implement Me #
if timestamps is None:
logger.error(
"No timestamps found in the provided data file!",
alt="[red]No timestamps found in the provided data file![/red]",
)
raise ValueError("Unable to extract timestamps from the data file.")
if start_time and end_time:
if start_time > end_time: # type: ignore
logger.error(
f"Timestamp {start_time=} should be earlier than {end_time=}!",
alt=f"[red]Timestamp {start_time=} should be earlier than {end_time=}![/red]",
)
raise ValueError(
f"Timestamp {start_time=} should be earlier than {end_time=}!"
)
# # convert back to Xarray !
# from xarray import DataArray
# timestamps = DataArray(timestamps, dims=["time"], name="time")
# Filter timestamps based on start_time and end_time
if start_time:
if start_time.tzinfo:
start_time = start_time.tz_localize(None)
if end_time:
if end_time.tzinfo:
end_time = end_time.tz_localize(None)
if start_time or end_time:
logger.debug(
f"> Slicing timestamps from {start_time} to {end_time}",
alt=f"> [bold]Slicing[/bold] timestamps from {start_time} to {end_time}",
)
timestamps = timestamps.sel(time=slice(start_time, end_time))
logger.debug(
f" : Slice of timestamps :\n{timestamps}",
alt=f" [blue]:[/blue] Slice of timestamps :\n{timestamps}",
)
if start_time and periods and not end_time:
if frequency:
timestamps = timestamps.resample(time=frequency).nearest()
timestamps = timestamps.isel(time=slice(0, periods))
elif end_time and periods and not start_time:
if frequency:
timestamps = timestamps.resample(time=frequency).nearest()
timestamps = timestamps.isel(time=slice(-periods, None))
elif start_time and end_time and periods:
logger.error(
f"Best if you provide a `start_time` OR an `end_time` along with `periods`, not both!",
alt=f"[bold]Best if you provide a[/bold] `start_time` [bold][italics yellow]or[/italics yellow] an[/bold] `end_time` [bold]along with[/bold] `periods`, [bold red]not both![/bold red]",
)
raise ValueError(
"Best if you provide a `start_time` or an `end_time` along with `periods`, not both! Else, I cannot decide which periods to return, from the start or the end."
)
elif frequency and not periods and (start_time or end_time):
# resampled_timestamps = DatetimeIndex(
# timestamps.resample(time=frequency).nearest()
# )
# resampled_timestamps.intersection(timestamps)
# logger.debug(
# f"Resampled timestamps at frequency = {frequency} :\n{timestamps}",
# alt=f"Resampled timestamps at frequency = {frequency} :\n{timestamps}",
# )
logger.warning(
f"Resampling the timestamps retrieved from the data could introduce new timestamps! Skipping...",
alt=f"[red on white]Resampling the timestamps retrieved from the data could introduce new timestamps! Skipping...[/red on white]",
)
timestamps = DatetimeIndex(timestamps)
else:
logger.debug(
f" + Generating timestamps based on user-requested time series parameters",
alt=f" [magenta]+[/magenta] [bold]Generating[/bold] timestamps based on user-requested time series parameters",
)
timestamps = generate_datetime_series(
start_time=start_time,
end_time=end_time,
periods=periods,
frequency=frequency,
timezone=timezone,
name=name,
)
# from pandas import to_datetime
# -----------------------------------------------------------------------
# If we do the following, we need to take care of external naive time series!
# timezone_aware_timestamps = [
# attach_requested_timezone(timestamp, timezone) for timestamp in timestamps
# ]
# return to_datetime(timezone_aware_timestamps, format="mixed")
# -----------------------------------------------------------------------
logger.warning(
f" < Returning timestamps :\n{timestamps}",
alt=f" [green]<[/green] Returning timestamps :\n{timestamps}",
)
if time_offset is not None:
timestamps += time_offset
return timestamps
helpers ¶
Functions:
| Name | Description |
|---|---|
get_day_from_hour_of_year | Get day of year from hour of year. |
get_days_in_year | Calculate the number of days in a given year, accounting for leap years. |
get_days_in_years | Calculate the number of days in each year of a given series, accounting for leap years. |
get_day_from_hour_of_year ¶
Get day of year from hour of year.
Source code in pvgisprototype/api/datetime/helpers.py
def get_day_from_hour_of_year(year: int, hour_of_year: int):
"""Get day of year from hour of year."""
start_of_year = np.datetime64(f"{year}-01-01")
date_and_time = start_of_year + np.timedelta64(hour_of_year, "h")
date_and_time = date_and_time.astype(datetime.datetime)
day_of_year = int(date_and_time.strftime("%j"))
# month = int(date_and_time.strftime('%m')) # Month
# day_of_month = int(date_and_time.strftime('%d'))
# hour_of_day = int(date_and_time.strftime('%H'))
return day_of_year
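A quick usage check; the function body is restated inline so the snippet runs standalone:

```python
import datetime
import numpy as np

def get_day_from_hour_of_year(year: int, hour_of_year: int) -> int:
    """Get day of year from hour of year."""
    start_of_year = np.datetime64(f"{year}-01-01")
    date_and_time = start_of_year + np.timedelta64(hour_of_year, "h")
    date_and_time = date_and_time.astype(datetime.datetime)
    return int(date_and_time.strftime("%j"))

print(get_day_from_hour_of_year(2024, 0))   # 1  (midnight, January 1st)
print(get_day_from_hour_of_year(2024, 24))  # 2  (midnight, January 2nd)
```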
get_days_in_year ¶
Calculate the number of days in a given year, accounting for leap years.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
year | int | The year for which to calculate the number of days. | required |
Returns:
| Type | Description |
|---|---|
int | The number of days in the given year. |
Examples:
Source code in pvgisprototype/api/datetime/helpers.py
def get_days_in_year(year):
"""Calculate the number of days in a given year, accounting for leap years.
Parameters
----------
year : int
The year for which to calculate the number of days.
Returns
-------
int
The number of days in the given year.
Examples
--------
>>> get_days_in_year(2020)
366
>>> get_days_in_year(2021)
365
"""
start_date = datetime(year, 1, 1) # First day of the year
end_date = datetime(year + 1, 1, 1) # First day of the next year
return (end_date - start_date).days
get_days_in_years ¶
Calculate the number of days in each year of a given series, accounting for leap years.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
years | DatetimeIndex | The years series for which to calculate the number of days. | required |
Returns:
| Type | Description |
|---|---|
Index | The number of days for each year in the given series. |
Examples:
>>> get_days_in_years(pd.DatetimeIndex(['2000-12-22 21:12:12', '2001-11-11 11:11:11']))
Index([366, 365], dtype='int64')
Source code in pvgisprototype/api/datetime/helpers.py
def get_days_in_years(years):
"""Calculate the number of days in each year of a given series, accounting for leap years.
Parameters
----------
years : DatetimeIndex
The timestamps whose years to count the days for.
Returns
-------
Index
The number of days for each year in the given series.
Examples
--------
>>> get_days_in_years(pd.DatetimeIndex(['2000-12-22 21:12:12', '2001-11-11 11:11:11']))
Index([366, 365], dtype='int64')
"""
years = years.year.to_numpy()  # extract the year of each timestamp
is_leap_year = (years % 4 == 0) & ((years % 100 != 0) | (years % 400 == 0))  # vectorized leap-year rule
days_in_year = np.where(is_leap_year, 366, 365)
return Index(days_in_year, dtype='int64')
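The vectorized leap-year rule used above, checked against some well-known years (1900 is not a leap year, 2000 is):

```python
import numpy as np

years = np.array([1900, 2000, 2020, 2021])
# Leap years are divisible by 4, except century years not divisible by 400
is_leap = (years % 4 == 0) & ((years % 100 != 0) | (years % 400 == 0))
days = np.where(is_leap, 366, 365)
print(days.tolist())  # [365, 366, 366, 365]
```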
now ¶
Functions:
| Name | Description |
|---|---|
now_datetime | Returns the current datetime in UTC. |
now_local_datetimezone | Get current local date and time and zone |
now_utc_datetimezone | Returns the current datetime in UTC. |
now_datetime ¶
Returns the current datetime in UTC.
Return a timezone-aware timestamp based on the current system time, defaulting to the UTC time zone.
now_local_datetimezone ¶
now_utc_datetimezone ¶
Returns the current datetime in UTC.
Return a timezone-aware timestamp based on the current system time, defaulting to the UTC time zone.
random ¶
Helper functions to generate random timestamps
Functions:
| Name | Description |
|---|---|
random_datetimezone | Generate a random datetime and timezone object |
random_day_of_year | Generate a random day of the year |
random_datetimezone ¶
Generate a random datetime and timezone object
Source code in pvgisprototype/api/datetime/random.py
def random_datetimezone() -> tuple:
"""
Generate a random datetime and timezone object
"""
year = datetime.now().year
month = randint(1, 12)
_, days_in_month = monthrange(year, month)
day = randint(1, days_in_month)
hour = randint(0, 23)
minute = randint(0, 59)
second = randint(0, 59)
datetimestamp = datetime(
year, month, day, hour, minute, second, tzinfo=ZoneInfo("UTC")
)
timezone_str = choice(list(available_timezones()))
timezone = ZoneInfo(timezone_str)
return datetimestamp, timezone
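The monthrange() call above is what keeps the random day valid for the drawn month; for example, February 2024 has 29 days:

```python
from calendar import monthrange

# monthrange returns (weekday of the 1st, number of days in the month)
_, days_in_month = monthrange(2024, 2)
print(days_in_month)  # 29
```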
random_day_of_year ¶
timezone ¶
Functions:
| Name | Description |
|---|---|
attach_requested_timezone | Attaches the requested timezone to a naive datetime. Attention : Defaults to UTC if no timezone requested! |
attach_timezone | Convert datetime object to timezone-aware. |
callback_generate_a_timezone | Convert string to |
ctx_attach_requested_timezone | Returns the current datetime in the user-requested timezone. |
generate_a_timezone | |
parse_timezone | |
attach_requested_timezone ¶
Attaches the requested timezone to a naive datetime. Attention : Defaults to UTC if no timezone requested!
Source code in pvgisprototype/api/datetime/timezone.py
def attach_requested_timezone(
timestamp: Timestamp,
timezone: ZoneInfo,
) -> Timestamp | None:
"""
Attaches the requested timezone to a naive datetime. Attention : Defaults to UTC if no timezone requested!
"""
# print(f'[green]i[/green] Callback function attach_requested_timezone()')
if timestamp.tzinfo is not None:
raise ValueError(
f" [yellow]>[/yellow] The provided timestamp '{timestamp}' already has a timezone! Expected a [yellow]naive[/yellow] [bold]datetime[/bold] or [bold]Timestamp[/bold] object."
)
if timezone:
try:
# print(f'[yellow]i[/yellow] Attaching the requested zone [bold]{timezone}[/bold] to {timestamp}')
return timestamp.tz_localize(timezone)
except Exception as e:
print(
f"[red]x[/red] Failed to attach the requested timezone '{timezone}' to the timestamp: {e}!"
)
else:
zoneinfo_utc = ZoneInfo("UTC")
print(
f" [yellow]i[/yellow] Timezone not requested! Setting to [red]{zoneinfo_utc}[/red]."
)
return timestamp.tz_localize(zoneinfo_utc)
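The tz_localize() call at the heart of the function attaches a zone to a naive timestamp without shifting the wall-clock time (values are illustrative):

```python
from zoneinfo import ZoneInfo
from pandas import Timestamp

naive = Timestamp("2024-06-01 12:00:00")   # no time zone attached
aware = naive.tz_localize(ZoneInfo("Europe/Athens"))
print(aware)  # 2024-06-01 12:00:00+03:00
```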
attach_timezone ¶
attach_timezone(
timestamp: Timestamp | None = None,
timezone: str | None = None,
) -> datetime | None
Convert datetime object to timezone-aware.
Source code in pvgisprototype/api/datetime/timezone.py
def attach_timezone(
timestamp: Timestamp | None = None, timezone: str | None = None
) -> datetime | None:
"""Convert datetime object to timezone-aware."""
if timestamp is None:
timestamp = Timestamp.now(tz=ZoneInfo("UTC")) # Default
if isinstance(timezone, str):
try:
tzinfo = generate_a_timezone(ZoneInfo(timezone))
timestamp = timestamp.replace(tzinfo=tzinfo)
except Exception as e:
raise ValueError(f"Could not convert timezone: {e}")
return timestamp
callback_generate_a_timezone ¶
ctx_attach_requested_timezone ¶
Returns the current datetime in the user-requested timezone.
Source code in pvgisprototype/api/datetime/timezone.py
def ctx_attach_requested_timezone(
ctx: typer.Context,
timestamp: Timestamp,
# param: typer.CallbackParam,
) -> Timestamp | None:
"""Returns the current datetime in the user-requested timezone."""
# print(f'[yellow]i[/yellow] Context: {ctx.params}')
# print(f'[yellow]i[/yellow] typer.CallbackParam: {param}')
# print(f' [yellow]>[/yellow] Executing ctx_attach_requested_timezone()')
timezone = ctx.params.get("timezone")
# print(f' [yellow]>[/yellow] User requested input parameter [code]timezone[/code] = [bold]{timezone}[/bold]')
# print(f' [green]>[/green] Callback function returns : {attach_requested_timezone(timestamp, timezone)}')
return attach_requested_timezone(timestamp, ZoneInfo(timezone))
generate_a_timezone ¶
Source code in pvgisprototype/api/datetime/timezone.py
def generate_a_timezone(timezone: ZoneInfo) -> ZoneInfo:
""" """
context_message = f"> Executing function generate_a_timezone()"
context_message_alternative = f"[yellow]>[/yellow] Executing [underline]function[/underline] generate_a_timezone()"
logger.debug(context_message, alt=context_message_alternative)
warning_message = warning_message_alternative = str()
if not timezone:
warning_message += (
f"No timezone requested. Assuming and setting {ZONEINFO_UTC}."
)
warning_message_alternative += f"[red]No timezone requested.[/red] [bold yellow]Assuming and setting [code]{ZONEINFO_UTC}[/code]."
logger.warning(warning_message, alt=warning_message_alternative)
timezone = ZONEINFO_UTC
if timezone == "local":
warning_message += (
f"Local timezone is requested. Retrieving it from the current system."
)
warning_message_alternative += f"[bold]Local timezone[/bold] is requested. [bold yellow]Retrieving it from the current system.[/bold yellow]"
logger.warning(warning_message, alt=warning_message_alternative)
try:
timezone = ZoneInfo(Timestamp(datetime.now().astimezone()).tzinfo)
except (ZoneInfoNotFoundError, Exception):
logger.error(
f"{x_mark} Requested zone {timezone} not found. Setting it to UTC.",
alt=f"[x_mark][red]Requested zone {timezone} not found. Setting it to [code]UTC[/code][/red].",
)
raise ValueError(f"The requested time zone {timezone} is not valid!")
context_message = f" < Returning object : {type(timezone)} : {timezone}"
context_message_alternative = (
f" [green]<[/green] Returning object : {type(timezone)} : {timezone}"
)
logger.debug(context_message, alt=context_message_alternative)
return timezone
parse_timezone ¶
Source code in pvgisprototype/api/datetime/timezone.py
def parse_timezone(
timezone: str,
) -> ZoneInfo | str:
""" """
context_message = f"> Executing parser function : parse_timestamp()"
# context_message += f'\ni Callback parameter : {typer.CallbackParam}'
context_message += f"\n - Parameter input : {type(timezone)} : {timezone}"
# context_message += f'\ni Context : {ctx.params}'
context_message_alternative = f"[yellow]>[/yellow] Executing [underline]parser function[/underline] : parse_timezone()"
# context_message_alternative += f'\n[yellow]i[/yellow] Callback parameter : {typer.CallbackParam}'
context_message_alternative += (
f"\n - Parameter input : {type(timezone)} : {timezone}"
)
# context_message_alternative += f'\n[yellow]i[/yellow] Context: {ctx.params}'
if not timezone:
timezone = str()
elif timezone != "local":
timezone = ZoneInfo(timezone)
context_message += f"\n < Returning object : {type(timezone)} : {timezone}"
context_message_alternative += (
f"\n [green]<[/green] Returning object : {type(timezone)} : {timezone}"
)
logger.debug(context_message, alt=context_message_alternative)
return timezone
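A sketch of the parsing behaviour, restated inline so it runs standalone: falsy input stays an empty string, "local" passes through unchanged, and anything else becomes a ZoneInfo:

```python
from zoneinfo import ZoneInfo

def parse_tz(timezone):
    # mirrors parse_timezone(): falsy -> "", "local" kept as-is, else ZoneInfo
    if not timezone:
        return ""
    if timezone == "local":
        return timezone
    return ZoneInfo(timezone)

print(parse_tz("Europe/Rome"))  # Europe/Rome
print(parse_tz("local"))        # local
```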
irradiance ¶
Modules:
| Name | Description |
|---|---|
diffuse | |
direct | |
effective | |
extraterrestrial | |
limits | |
reflectivity | |
shortwave | |
diffuse ¶
Modules:
| Name | Description |
|---|---|
altitude | |
clear_sky | |
ground_reflected | |
horizontal | |
inclined | |
altitude ¶
Functions:
| Name | Description |
|---|---|
calculate_diffuse_sky_irradiance_series | Calculate the diffuse sky irradiance |
calculate_diffuse_solar_altitude_coefficients_series | Calculate the diffuse solar altitude coefficients. |
calculate_diffuse_solar_altitude_function_series | Calculate the diffuse solar altitude |
calculate_diffuse_transmission_function_series | Diffuse transmission function over a period of time |
calculate_term_n_series | Define the N term for a period of time |
calculate_diffuse_sky_irradiance_series ¶
calculate_diffuse_sky_irradiance_series(
n_series: List[float],
surface_tilt: float | None = radians(45),
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
log: int = 0,
)
Calculate the diffuse sky irradiance
The diffuse sky irradiance function F(γN) depends on the surface tilt γN (in radians)
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
surface_tilt | float | None | The tilt (also referred to as : inclination or slope) angle of a solar surface | radians(45) |
n_series | List[float] | The term N | required |
Notes
Internally the function first calculates the dimensionless fraction of the sky dome viewed by a tilted (or inclined) surface `ri(γN)`.
Source code in pvgisprototype/api/irradiance/diffuse/altitude.py
@log_function_call
def calculate_diffuse_sky_irradiance_series(
n_series: List[float],
surface_tilt: float | None = np.radians(45),
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
log: int = 0,
):
"""Calculate the diffuse sky irradiance
The diffuse sky irradiance function F(γN) depends on the surface tilt `γN`
(in radians)
Parameters
----------
surface_tilt: float (radians)
The tilt (also referred to as : inclination or slope) angle of a solar
surface
n_series: float
The term N
Returns
-------
Notes
-----
Internally the function calculates first the dimensionless fraction of the
sky dome viewed by a tilted (or inclined) surface `ri(γN)`.
"""
diffuse_sky_irradiance_series = calculate_diffuse_sky_irradiance_series_hofierka(
n_series=n_series,
surface_tilt=surface_tilt,
dtype=dtype,
array_backend=array_backend,
log=log,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return diffuse_sky_irradiance_series
calculate_diffuse_solar_altitude_coefficients_series ¶
calculate_diffuse_solar_altitude_coefficients_series(
linke_turbidity_factor_series,
verbose: int = 0,
log: int = 0,
)
Calculate the diffuse solar altitude coefficients.
Calculate the diffuse solar altitude coefficients over a period of time.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
linke_turbidity_factor_series | List[LinkeTurbidityFactor] or LinkeTurbidityFactor | The Linke turbidity factors as a list of LinkeTurbidityFactor objects or a single object. | required |
Source code in pvgisprototype/api/irradiance/diffuse/altitude.py
@log_function_call
def calculate_diffuse_solar_altitude_coefficients_series(
linke_turbidity_factor_series,
verbose: int = 0,
log: int = 0,
):
"""Calculate the diffuse solar altitude coefficients.
Calculate the diffuse solar altitude coefficients over a period of time.
Parameters
----------
linke_turbidity_factor_series: (List[LinkeTurbidityFactor] or LinkeTurbidityFactor)
The Linke turbidity factors as a list of LinkeTurbidityFactor objects
or a single object.
Returns
-------
"""
a1_series, a2_series, a3_series = (
calculate_diffuse_solar_altitude_coefficients_series_hofierka(
linke_turbidity_factor_series=linke_turbidity_factor_series,
verbose=verbose,
log=log,
)
)
if verbose == DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return a1_series, a2_series, a3_series
calculate_diffuse_solar_altitude_function_series ¶
calculate_diffuse_solar_altitude_function_series(
solar_altitude_series: List[float],
linke_turbidity_factor_series: LinkeTurbidityFactor,
verbose: int = 0,
log: int = 0,
)
Calculate the diffuse solar altitude
Notes
Other symbol: function Fd
Source code in pvgisprototype/api/irradiance/diffuse/altitude.py
@log_function_call
def calculate_diffuse_solar_altitude_function_series(
solar_altitude_series: List[float],
linke_turbidity_factor_series: LinkeTurbidityFactor,
verbose: int = 0,
log: int = 0,
):
"""Calculate the diffuse solar altitude
Notes
-----
Other symbol: function Fd
"""
a1_series, a2_series, a3_series = (
calculate_diffuse_solar_altitude_function_series_hofierka(
solar_altitude_series=solar_altitude_series,
linke_turbidity_factor_series=linke_turbidity_factor_series,
verbose=verbose,
log=log,
)
)
if verbose == DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return (
a1_series
+ a2_series * np.sin(solar_altitude_series.radians)
+ a3_series * np.power(np.sin(solar_altitude_series.radians), 2)
)
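The closing return expression above is the polynomial Fd(h0) = a1 + a2·sin(h0) + a3·sin²(h0). A self-contained sketch of just that arithmetic (the function name is illustrative):

```python
import numpy as np

def diffuse_solar_altitude_function(solar_altitude_radians, a1, a2, a3):
    # Fd(h0) = a1 + a2·sin(h0) + a3·sin²(h0), mirroring the return
    # expression of calculate_diffuse_solar_altitude_function_series
    sine_altitude = np.sin(solar_altitude_radians)
    return a1 + a2 * sine_altitude + a3 * sine_altitude**2
```
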
calculate_diffuse_transmission_function_series ¶
calculate_diffuse_transmission_function_series(
linke_turbidity_factor_series,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = 0,
log: int = 0,
) -> array
Diffuse transmission function over a period of time
Notes
From r.pv's source code:
tn = -0.015843 + locLinke * (0.030543 + 0.0003797 * locLinke);
From Hofierka (2002) :
The estimate of the transmission function Tn(TLK) gives a theoretical
diffuse irradiance on a horizontal surface with the sun vertically
overhead for the air mass 2 Linke turbidity factor. The following
second order polynomial expression is used:
Tn(TLK) = -0.015843 + 0.030543 TLK + 0.0003797 TLK^2
Source code in pvgisprototype/api/irradiance/diffuse/altitude.py
@log_function_call
def calculate_diffuse_transmission_function_series(
linke_turbidity_factor_series,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = 0,
log: int = 0,
) -> np.array:
"""Diffuse transmission function over a period of time
Notes
-----
From r.pv's source code:
tn = -0.015843 + locLinke * (0.030543 + 0.0003797 * locLinke);
From Hofierka (2002) :
The estimate of the transmission function Tn(TLK) gives a theoretical
diffuse irradiance on a horizontal surface with the sun vertically
overhead for the air mass 2 Linke turbidity factor. The following
second order polynomial expression is used:
Tn(TLK) = -0.015843 + 0.030543 TLK + 0.0003797 TLK^2
"""
diffuse_transmission_series = calculate_diffuse_transmission_function_series_hofierka(
linke_turbidity_factor_series=linke_turbidity_factor_series,
dtype=dtype,
array_backend=array_backend,
log=log,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return diffuse_transmission_series
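The second-order polynomial Tn(TLK) quoted in the notes can be evaluated directly; the sketch below uses the factored form from r.pv's source code (the function name is illustrative):

```python
def diffuse_transmission_function(linke_turbidity_factor):
    # Tn(TLK) = -0.015843 + 0.030543·TLK + 0.0003797·TLK²,
    # factored as in r.pv: tn = -0.015843 + TLK·(0.030543 + 0.0003797·TLK)
    tlk = linke_turbidity_factor
    return -0.015843 + tlk * (0.030543 + 0.0003797 * tlk)
```

For the reference air-mass-2 turbidity TLK = 2, the polynomial yields roughly 0.0468.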
calculate_term_n_series ¶
calculate_term_n_series(
kb_series: List[float],
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = 0,
log: int = 0,
)
Define the N term for a period of time
N = 0.00263 − 0.712 × kb − 0.6883 × kb²
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
kb_series | List[float] | Direct to extraterrestrial irradiance ratio | required |
Returns:
| Name | Type | Description |
|---|---|---|
N | float | The N term |
Source code in pvgisprototype/api/irradiance/diffuse/altitude.py
@log_function_call
def calculate_term_n_series(
kb_series: List[float],
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = 0,
log: int = 0,
):
"""Define the N term for a period of time
N = 0.00263 − 0.712 × kb − 0.6883 × kb2
Parameters
----------
kb_series: float
Direct to extraterrestrial irradiance ratio
Returns
-------
N: float
The N term
"""
term_n_series = calculate_term_n_series_hofierka(
kb_series=kb_series,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
log_data_fingerprint(
data=term_n_series,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return term_n_series
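The N term itself is a second-order polynomial in kb; a minimal vectorised sketch of the stated formula (the function name is illustrative):

```python
import numpy as np

def calculate_term_n(kb_series):
    # N = 0.00263 − 0.712·kb − 0.6883·kb²
    kb = np.asarray(kb_series, dtype=float)
    return 0.00263 - 0.712 * kb - 0.6883 * kb**2
```
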
clear_sky ¶
Modules:
| Name | Description |
|---|---|
horizontal | |
horizontal ¶
Functions:
| Name | Description |
|---|---|
calculate_clear_sky_diffuse_horizontal_irradiance | |
calculate_clear_sky_diffuse_horizontal_irradiance ¶
calculate_clear_sky_diffuse_horizontal_irradiance(
longitude: float,
latitude: float,
timestamps: DatetimeIndex = DatetimeIndex(
[now(tz="UTC")]
),
timezone: ZoneInfo = ZoneInfo("UTC"),
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
)
Source code in pvgisprototype/api/irradiance/diffuse/clear_sky/horizontal.py
@log_function_call
def calculate_clear_sky_diffuse_horizontal_irradiance(
longitude: float,
latitude: float,
timestamps: DatetimeIndex = DatetimeIndex([Timestamp.now(tz='UTC')]),
timezone: ZoneInfo = ZoneInfo('UTC'),
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
# unrefracted_solar_zenith: UnrefractedSolarZenith | None = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT, # radians
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
# angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
):
"""
"""
# solar altitude : required by
# `calculate_diffuse_horizontal_irradiance_hofierka()`
# to calculate the extraterrestrial irradiance on a horizontal surface
solar_altitude_series = model_solar_altitude_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_position_model=solar_position_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
# solar_time_model=solar_time_model,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
diffuse_horizontal_irradiance_series = (
calculate_clear_sky_diffuse_horizontal_irradiance_hofierka(
timestamps=timestamps,
linke_turbidity_factor_series=linke_turbidity_factor_series,
solar_altitude_series=solar_altitude_series,
solar_constant=solar_constant,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
)
diffuse_horizontal_irradiance_series.build_output(
verbose=verbose, fingerprint=fingerprint
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=diffuse_horizontal_irradiance_series.value,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return diffuse_horizontal_irradiance_series
ground_reflected ¶
Functions:
| Name | Description |
|---|---|
calculate_ground_reflected_inclined_irradiance_series | Calculate the clear-sky diffuse ground reflected irradiance on an inclined surface (Ri). |
calculate_ground_reflected_inclined_irradiance_series ¶
calculate_ground_reflected_inclined_irradiance_series(
longitude: float,
latitude: float,
elevation: float,
timestamps: DatetimeIndex = DatetimeIndex(
[now(tz="UTC")]
),
timezone: ZoneInfo = ZoneInfo("UTC"),
surface_orientation: float = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: float = SURFACE_TILT_DEFAULT,
surface_tilt_threshold=SURFACE_TILT_HORIZONTALLY_FLAT_PANEL_THRESHOLD,
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
albedo: float | None = ALBEDO_DEFAULT,
global_horizontal_irradiance: ndarray | None = None,
apply_reflectivity_factor: bool = ANGULAR_LOSS_FACTOR_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = noaa,
solar_time_model: SolarTimeModel = noaa,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> DiffuseGroundReflectedInclinedIrradiance
Calculate the clear-sky diffuse ground reflected irradiance on an inclined surface (Ri).
The calculation relies on an isotropic assumption. The ground reflected clear-sky irradiance received on an inclined surface [W.m-2] is proportional to the global horizontal irradiance Ghc, to the mean ground albedo ρg and a fraction of the ground viewed by an inclined surface rg(γN).
Source code in pvgisprototype/api/irradiance/diffuse/ground_reflected.py
@log_function_call
@custom_cached
def calculate_ground_reflected_inclined_irradiance_series(
longitude: float,
latitude: float,
elevation: float,
timestamps: DatetimeIndex = DatetimeIndex([Timestamp.now(tz='UTC')]),
timezone: ZoneInfo = ZoneInfo('UTC'),
surface_orientation: float = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: float = SURFACE_TILT_DEFAULT,
surface_tilt_threshold = SURFACE_TILT_HORIZONTALLY_FLAT_PANEL_THRESHOLD,
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
# refracted_solar_zenith: (
# float | None
# ) = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT, # radians
albedo: float | None = ALBEDO_DEFAULT,
global_horizontal_irradiance: ndarray | None = None,
apply_reflectivity_factor: bool = ANGULAR_LOSS_FACTOR_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = SolarPositionModel.noaa,
solar_time_model: SolarTimeModel = SolarTimeModel.noaa,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> DiffuseGroundReflectedInclinedIrradiance:
"""Calculate the clear-sky diffuse ground reflected irradiance on an inclined surface (Ri).
The calculation relies on an isotropic assumption. The ground reflected
clear-sky irradiance received on an inclined surface [W.m-2] is
proportional to the global horizontal irradiance Ghc, to the mean ground
albedo ρg and a fraction of the ground viewed by an inclined surface
rg(γN).
"""
ground_reflected_inclined_irradiance_series = (
DiffuseGroundReflectedInclinedIrradiance(ground_view_fraction=0)
)
if surface_tilt != 0: # there is no horizontal diffuse ground-reflected irradiance
# if global horizontal irradiance is read from external time series as an array
if isinstance(global_horizontal_irradiance, ndarray):
ground_reflected_inclined_irradiance_series = (
calculate_ground_reflected_inclined_irradiance_series_pvgis(
longitude=longitude,
latitude=latitude,
elevation=elevation,
timestamps=timestamps,
surface_orientation=surface_orientation,
surface_tilt=surface_tilt,
surface_tilt_threshold=surface_tilt_threshold,
albedo=albedo,
global_horizontal_irradiance=global_horizontal_irradiance,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
)
else:
# simulate clear-sky direct & diffuse sky-reflected components
ground_reflected_inclined_irradiance_series = calculate_clear_sky_ground_reflected_inclined_irradiance_series_pvgis(
longitude=longitude,
latitude=latitude,
elevation=elevation,
timestamps=timestamps,
timezone=timezone,
surface_orientation=surface_orientation,
surface_tilt=surface_tilt,
surface_tilt_threshold=surface_tilt_threshold,
linke_turbidity_factor_series=linke_turbidity_factor_series,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
albedo=albedo,
solar_position_model=solar_position_model,
solar_time_model=solar_time_model,
solar_constant=solar_constant,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if apply_reflectivity_factor:
ground_reflected_inclined_irradiance_series = apply_reflectivity_factor_for_nondirect_irradiance(
ground_reflected_inclined_irradiance_series=ground_reflected_inclined_irradiance_series,
surface_tilt=surface_tilt,
dtype=dtype,
array_backend=array_backend,
)
else:
array_parameters = {
"shape": timestamps.shape,
"dtype": dtype,
"init_method": "zeros",
"backend": array_backend,
} # Borrow shape from timestamps
zero_array = create_array(**array_parameters)
ground_reflected_inclined_irradiance_series.reflectivity = zero_array
ground_reflected_inclined_irradiance_series.value_before_reflectivity = zero_array
ground_reflected_inclined_irradiance_series.reflectivity_factor = zero_array
ground_reflected_inclined_irradiance_series.build_output(verbose, fingerprint)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=ground_reflected_inclined_irradiance_series,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return ground_reflected_inclined_irradiance_series
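Under the isotropic assumption described above, the ground-view fraction rg(γN) is commonly taken as (1 − cos γN)/2, so the ground-reflected component is proportional to the global horizontal irradiance and the albedo. A minimal sketch (the exact form of rg(γN) is an assumption here, not quoted from this module, and the function name is illustrative):

```python
import numpy as np

def ground_reflected_inclined_irradiance(global_horizontal_irradiance, albedo, surface_tilt):
    # rg(γN) = (1 − cos γN) / 2 : assumed isotropic ground-view fraction
    ground_view_fraction = (1.0 - np.cos(surface_tilt)) / 2.0
    # Ri ∝ ρg · Ghc · rg(γN)
    return albedo * np.asarray(global_horizontal_irradiance) * ground_view_fraction
```

Consistent with the `surface_tilt != 0` guard above, a horizontal surface sees no ground at all (rg = 0) and therefore receives no ground-reflected irradiance.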
horizontal ¶
Functions:
| Name | Description |
|---|---|
calculate_diffuse_horizontal_irradiance_from_external_data | Calculate the diffuse horizontal irradiance from SARAH time series. |
calculate_diffuse_horizontal_irradiance_from_external_data ¶
calculate_diffuse_horizontal_irradiance_from_external_data(
global_horizontal_irradiance_series,
direct_horizontal_irradiance_series,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
fingerprint: bool = False,
)
Calculate the diffuse horizontal irradiance from SARAH time series.
Calculate the diffuse horizontal irradiance incident on a solar surface from SARAH time series.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
global_horizontal_irradiance_series | ndarray | The global horizontal irradiance time series (direct plus diffuse) in W/m². | required |
direct_horizontal_irradiance_series | ndarray | The direct (beam) horizontal irradiance time series in W/m². | required |
Returns:
| Name | Type | Description |
|---|---|---|
diffuse_irradiance | float | The diffuse radiant flux incident on a surface per unit area in W/m². |
Source code in pvgisprototype/api/irradiance/diffuse/horizontal.py
@log_function_call
def calculate_diffuse_horizontal_irradiance_from_external_data(
global_horizontal_irradiance_series,
direct_horizontal_irradiance_series,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT, # Not yet integrated !
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
fingerprint: bool = False,
):
"""Calculate the diffuse horizontal irradiance from SARAH time series.
Calculate the diffuse horizontal irradiance incident on a solar surface
from SARAH time series.
Parameters
----------
global_horizontal_irradiance_series
The global horizontal irradiance time series in W/m², i.e. the solar
radiation that reaches a horizontal plane at the surface of the Earth,
comprising both direct and diffuse radiation.
direct_horizontal_irradiance_series
The direct (beam) horizontal irradiance time series in W/m².
Returns
-------
diffuse_irradiance: float
The diffuse radiant flux incident on a surface per unit area in W/m².
"""
diffuse_horizontal_irradiance_series = (
global_horizontal_irradiance_series - direct_horizontal_irradiance_series
).astype(dtype=dtype)
if diffuse_horizontal_irradiance_series.size == 1:
single_value = float(diffuse_horizontal_irradiance_series)
warning = (
f"{exclamation_mark} The selected timestamp "
+ " matches the single value "
+ f"{single_value}"
)
logger.warning(warning)
out_of_range, out_of_range_index = identify_values_out_of_range(
series=diffuse_horizontal_irradiance_series,
# shape=timestamps.shape,
shape=global_horizontal_irradiance_series.shape,
data_model=DiffuseSkyReflectedHorizontalIrradianceFromExternalData(),
)
diffuse_horizontal_irradiance_series = DiffuseSkyReflectedHorizontalIrradianceFromExternalData(
value=diffuse_horizontal_irradiance_series,
out_of_range=out_of_range,
out_of_range_index=out_of_range_index,
global_horizontal_irradiance=global_horizontal_irradiance_series,
direct_horizontal_irradiance=direct_horizontal_irradiance_series,
)
diffuse_horizontal_irradiance_series.build_output(
verbose=verbose, fingerprint=fingerprint
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=diffuse_horizontal_irradiance_series,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return diffuse_horizontal_irradiance_series
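The core of the computation is the subtraction Dh = Gh − Bh, with negative results flagged as out of range. A minimal sketch with illustrative values:

```python
import numpy as np

ghi = np.array([500.0, 300.0, 120.0])  # global horizontal irradiance, W/m²
bhi = np.array([350.0, 180.0, 130.0])  # direct (beam) horizontal irradiance, W/m²

dhi = ghi - bhi                        # diffuse horizontal irradiance, W/m²
out_of_range = dhi < 0                 # negative values flag inconsistent records
```

In the last record the direct component exceeds the global one, which is physically impossible, so that entry would be reported via `out_of_range_index`.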
inclined ¶
Functions:
| Name | Description |
|---|---|
calculate_diffuse_inclined_irradiance | Calculate the diffuse irradiance incident on a solar surface. |
calculate_diffuse_inclined_irradiance ¶
calculate_diffuse_inclined_irradiance(
longitude: float,
latitude: float,
elevation: float,
surface_orientation: SurfaceOrientation = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt = SURFACE_TILT_DEFAULT,
timestamps: DatetimeIndex | None = DatetimeIndex(
[now(tz="UTC")]
),
timezone: ZoneInfo | None = None,
global_horizontal_irradiance: ndarray | None = None,
direct_horizontal_irradiance: ndarray | None = None,
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
apply_reflectivity_factor: bool = ANGULAR_LOSS_FACTOR_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
sun_horizon_position: List[
SunHorizonPositionModel
] = SUN_HORIZON_POSITION_DEFAULT,
solar_incidence_model: SolarIncidenceModel = SOLAR_INCIDENCE_ALGORITHM_DEFAULT,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = pvgis,
shading_states: List[ShadingState] = [all],
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> DiffuseSkyReflectedInclinedIrradiance
Calculate the diffuse irradiance incident on a solar surface.
Notes
In order of appearance:
- extraterrestrial_normal_irradiance : G0
- extraterrestrial_horizontal_irradiance : G0h = G0 sin(h0)
- kb : Proportion between direct (beam) and extraterrestrial irradiance : Kb
- diffuse_horizontal_component : Dhc [W.m-2]
- diffuse_transmission_function() :
- linke_turbidity_factor :
- diffuse_solar_altitude_function() :
- solar_altitude :
- calculate_term_n():
- n : the N term
- diffuse_sky_irradiance()
- sine_solar_incidence_angle
- sine_solar_altitude
- diffuse_sky_irradiance
- calculate_diffuse_sky_irradiance() : F(γN)
- surface_tilt :
- diffuse_inclined_irradiance :
- diffuse_horizontal_component :
- azimuth_difference :
- solar_azimuth :
- surface_orientation :
- diffuse_irradiance
Source code in pvgisprototype/api/irradiance/diffuse/inclined.py
@log_function_call
def calculate_diffuse_inclined_irradiance(
longitude: float,
latitude: float,
elevation: float,
surface_orientation: SurfaceOrientation = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt = SURFACE_TILT_DEFAULT,
timestamps: DatetimeIndex | None = DatetimeIndex([Timestamp.now(tz='UTC')]),
timezone: ZoneInfo | None = None,
global_horizontal_irradiance: ndarray | None = None,
direct_horizontal_irradiance: ndarray | None = None,
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
# refracted_solar_zenith: (
# float | None
# ) = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT, # radians
apply_reflectivity_factor: bool = ANGULAR_LOSS_FACTOR_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
sun_horizon_position: List[SunHorizonPositionModel] = SUN_HORIZON_POSITION_DEFAULT,
solar_incidence_model: SolarIncidenceModel = SOLAR_INCIDENCE_ALGORITHM_DEFAULT,
# complementary_incidence_angle: bool = COMPLEMENTARY_INCIDENCE_ANGLE_DEFAULT, # Let Me Hardcoded, Read the docstring!
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = ShadingModel.pvgis,
shading_states: List[ShadingState] = [ShadingState.all],
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output:bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> DiffuseSkyReflectedInclinedIrradiance:
"""Calculate the diffuse irradiance incident on a solar surface.
Notes
-----
In order of appearance:
- extraterrestrial_normal_irradiance : G0
- extraterrestrial_horizontal_irradiance : G0h = G0 sin(h0)
- kb : Proportion between direct (beam) and extraterrestrial irradiance : Kb
- diffuse_horizontal_component : Dhc [W.m-2]
- diffuse_transmission_function() :
- linke_turbidity_factor :
- diffuse_solar_altitude_function() :
- solar_altitude :
- calculate_term_n():
- n : the N term
- diffuse_sky_irradiance()
- sine_solar_incidence_angle
- sine_solar_altitude
- diffuse_sky_irradiance
- calculate_diffuse_sky_irradiance() : F(γN)
- surface_tilt :
- diffuse_inclined_irradiance :
- diffuse_horizontal_component :
- azimuth_difference :
- solar_azimuth :
- surface_orientation :
- diffuse_irradiance
"""
# build reusable parameter dictionaries
coordinates = {
'longitude': longitude,
'latitude': latitude,
}
time = {
'timestamps': timestamps,
'timezone': timezone,
}
solar_positioning = {
'solar_position_model': solar_position_model,
'adjust_for_atmospheric_refraction': adjust_for_atmospheric_refraction,
'solar_time_model': solar_time_model,
}
surface_position = {
'surface_orientation': surface_orientation,
'surface_tilt': surface_tilt,
}
earth_orbit = {
'eccentricity_phase_offset': eccentricity_phase_offset,
'eccentricity_amplitude': eccentricity_amplitude,
}
array_parameters = {
"dtype": dtype,
"array_backend": array_backend,
}
output_parameters = {
'verbose': verbose, # Is this wanted here ? i.e. not setting = 0 ?
'log': log,
}
# Some quantities are not always required, hence set them to avoid UnboundLocalError!
extended_array_parameters = {
"shape": timestamps.shape,
"dtype": dtype,
"backend": array_backend,
"init_method": "zeros",
} # Borrow shape from timestamps
solar_azimuth_series = SolarAzimuth(value=create_array(**extended_array_parameters))
solar_incidence_series = create_array(**extended_array_parameters)
# Calculate quantities required : ---------------------------- >>> >>> >>>
# 1. to model the diffuse horizontal irradiance [optional]
# 2. to calculate the diffuse sky ... to consider shaded, sunlit and potentially sunlit surfaces
#
# extraterrestrial on a horizontal surface requires the solar altitude
solar_altitude_series = model_solar_altitude_series(
**coordinates,
**time,
**solar_positioning,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
**earth_orbit,
**array_parameters,
**output_parameters,
)
# Calculate quantities required : ---------------------------- <<< <<< <<<
if surface_tilt > SURFACE_TILT_HORIZONTALLY_FLAT_PANEL_THRESHOLD: # tilted (or inclined) surface
# requires the solar incidence angle for shading and times of sunlit surface
solar_incidence_series = model_solar_incidence_series(
**coordinates,
**time,
**surface_position,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
solar_incidence_model=solar_incidence_model,
horizon_profile=horizon_profile,
shading_model=shading_model,
complementary_incidence_angle=True, # True = between sun-vector and surface-plane !
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
**earth_orbit,
**array_parameters,
validate_output=validate_output,
**output_parameters,
)
# Potentially sunlit surface series : solar altitude < 0.1 radians (or < 5.7 degrees)
if np.any(solar_altitude_series.radians < 0.1): # requires the solar azimuth
solar_azimuth_series = model_solar_azimuth_series(
**coordinates,
**time,
**solar_positioning,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
**earth_orbit,
verbose=verbose,
)
surface_in_shade_series = model_surface_in_shade_series(
horizon_profile=horizon_profile,
**coordinates,
**time,
**solar_positioning,
shading_model=shading_model,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
**earth_orbit,
**array_parameters,
validate_output=validate_output,
**output_parameters,
)
common_input_parameters = {
**surface_position,
**time,
'global_horizontal_irradiance_series': global_horizontal_irradiance,
'direct_horizontal_irradiance_series': direct_horizontal_irradiance,
'apply_reflectivity_factor': apply_reflectivity_factor,
'solar_altitude_series': solar_altitude_series,
'solar_azimuth_series': solar_azimuth_series,
'solar_incidence_series': solar_incidence_series,
'surface_in_shade_series': surface_in_shade_series,
'shading_states': shading_states,
'solar_constant': solar_constant,
**earth_orbit,
**array_parameters,
**output_parameters,
}
# if solar irradiance time series read from external data
if isinstance(global_horizontal_irradiance, ndarray) and isinstance(
direct_horizontal_irradiance, (ndarray, DirectHorizontalIrradiance)
):
diffuse_inclined_irradiance_series = (
calculate_diffuse_inclined_irradiance_muneer(
**common_input_parameters,
)
)
else: # simulate the clear-sky index
diffuse_inclined_irradiance_series = (
calculate_clear_sky_diffuse_inclined_irradiance_muneer(
elevation=elevation,
linke_turbidity_factor_series=linke_turbidity_factor_series,
**common_input_parameters,
)
)
# ==========================================================================
# The following do not affect calculations, yet they are important for the output!
# Perhaps find a way to "hide" them ?
diffuse_inclined_irradiance_series.angle_output_units = angle_output_units
diffuse_inclined_irradiance_series.solar_altitude = getattr(
solar_altitude_series, angle_output_units
)
# ==========================================================================
diffuse_inclined_irradiance_series.reflected_percentage = (
calculate_reflectivity_effect_percentage(
irradiance=diffuse_inclined_irradiance_series.value_before_reflectivity,
reflectivity=diffuse_inclined_irradiance_series.reflectivity_factor,
)
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
diffuse_inclined_irradiance_series.build_output(verbose, fingerprint)
log_data_fingerprint(
data=diffuse_inclined_irradiance_series.value,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return diffuse_inclined_irradiance_series
direct ¶
Modules:
| Name | Description |
|---|---|
helpers | |
horizontal | This Python module is part of PVGIS' API. It implements functions to calculate |
inclined | This Python module is part of PVGIS' API. It implements functions to calculate |
normal | This Python module is part of PVGIS' API. It implements functions to calculate |
normal_from_horizontal | This Python module is part of PVGIS' API. It implements functions to calculate |
optical_air_mass | |
rayleigh_optical_thickness | This Python module is part of PVGIS' API. It implements functions to calculate |
refraction | This Python module is part of PVGIS' API. It implements functions to calculate |
helpers ¶
Functions:
| Name | Description |
|---|---|
compare_temporal_resolution | Check if the frequency of timestamps matches the temporal resolution of the array. |
compare_temporal_resolution ¶
compare_temporal_resolution(
timestamps: DatetimeIndex = None,
array: ndarray = None,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
) -> None
Check if the frequency of timestamps matches the temporal resolution of the array.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
timestamps | DatetimeIndex | An array of generated timestamps. | None |
array | ndarray | An array of data corresponding to some time series. | None |
Raises:
| Type | Description |
|---|---|
ValueError | If the lengths of the timestamps and the array don't match. |
Source code in pvgisprototype/api/irradiance/direct/helpers.py
@log_function_call
def compare_temporal_resolution(
timestamps: DatetimeIndex = None,
array: ndarray = None,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
) -> None:
"""
Check if the frequency of `timestamps` matches the temporal resolution of the `array`.
Parameters
----------
timestamps:
An array of generated timestamps.
array:
An array of data corresponding to some time series.
Raises
------
ValueError: If the lengths of `timestamps` and `array` don't match.
"""
if timestamps.size != array.size:
raise ValueError(
f"The frequency of `timestamps` ({timestamps.size}) does not match the temporal resolution of the `array` ({array.size}). Please ensure they have the same temporal resolution."
)
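A minimal usage sketch of the same length comparison (a local re-implementation for illustration, not the package function itself):

```python
import numpy as np
import pandas as pd

def check_temporal_resolution(timestamps, array):
    # same size comparison as compare_temporal_resolution above
    if timestamps.size != array.size:
        raise ValueError(
            f"`timestamps` ({timestamps.size}) and `array` ({array.size}) "
            "do not have the same temporal resolution"
        )

timestamps = pd.date_range("2024-06-01", periods=24, freq="h")
check_temporal_resolution(timestamps, np.zeros(24))  # passes silently
```

Passing an array of a different length (e.g. 23 values against 24 timestamps) raises the `ValueError` described above.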
horizontal ¶
This Python module is part of PVGIS' API. It implements functions to calculate the direct solar irradiance.
Direct or beam irradiance is one of the main components of solar irradiance. It arrives along the beam from the Sun and is not scattered before it irradiates a surface.
During a cloudy day, sunlight is partially absorbed and partially scattered by air molecules and clouds; the scattered part that still reaches the ground is the diffuse irradiance, and the remaining part is the direct irradiance.
Functions:
| Name | Description |
|---|---|
calculate_clear_sky_direct_horizontal_irradiance_series | Calculate the direct horizontal irradiance |
calculate_clear_sky_direct_horizontal_irradiance_series ¶
calculate_clear_sky_direct_horizontal_irradiance_series(
longitude: float,
latitude: float,
elevation: float,
timestamps: DatetimeIndex | None = None,
timezone: ZoneInfo | None = None,
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = pvgis,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> ndarray
Calculate the direct horizontal irradiance
This function implements the algorithm described by Hofierka [1]_.
Notes
Known also as : SID, units : W*m-2
References
.. [1] Hofierka, J. (2002). Some title of the paper. Journal Name, vol(issue), pages.
Source code in pvgisprototype/api/irradiance/direct/horizontal.py
@log_function_call
@custom_cached
def calculate_clear_sky_direct_horizontal_irradiance_series(
longitude: float,
latitude: float,
elevation: float,
timestamps: DatetimeIndex | None = None,
timezone: ZoneInfo | None = None,
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
# unrefracted_solar_zenith: UnrefractedSolarZenith | None = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = ShadingModel.pvgis,
# angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> np.ndarray:
"""Calculate the direct horizontal irradiance
This function implements the algorithm described by Hofierka [1]_.
Notes
-----
Known also as : SID, units : W*m-2
References
----------
.. [1] Hofierka, J. (2002). Some title of the paper. Journal Name, vol(issue), pages.
"""
# build reusable parameter dictionaries
coordinates = {
'longitude': longitude,
'latitude': latitude,
}
time = {
'timestamps': timestamps,
'timezone': timezone,
}
solar_positioning = {
'solar_position_model': solar_position_model,
'adjust_for_atmospheric_refraction': adjust_for_atmospheric_refraction,
'solar_time_model': solar_time_model,
}
earth_orbit = {
'eccentricity_phase_offset': eccentricity_phase_offset,
'eccentricity_amplitude': eccentricity_amplitude,
}
array_parameters = {
"dtype": dtype,
"array_backend": array_backend,
}
output_parameters = {
'verbose': verbose, # Is this wanted here ? i.e. not setting = 0 ?
'log': log,
}
solar_time_model = validate_model(
SolarTimeModel, solar_time_model
) # can be only one of!
solar_altitude_series = model_solar_altitude_series(
**coordinates,
**time,
solar_position_model=solar_position_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
# solar_timing_model=solar_time_model,
**earth_orbit,
**array_parameters,
verbose=verbose,
)
surface_in_shade_series = model_surface_in_shade_series(
horizon_profile=horizon_profile,
**coordinates,
**time,
**solar_positioning,
shading_model=shading_model,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
**earth_orbit,
**array_parameters,
validate_output=validate_output,
**output_parameters,
)
direct_horizontal_irradiance_series = (
calculate_clear_sky_direct_horizontal_irradiance_hofierka(
elevation=elevation,
timestamps=timestamps,
solar_altitude_series=solar_altitude_series,
surface_in_shade_series=surface_in_shade_series,
linke_turbidity_factor_series=linke_turbidity_factor_series,
solar_constant=solar_constant,
**earth_orbit,
**array_parameters,
**output_parameters,
)
)
direct_horizontal_irradiance_series.build_output(verbose, fingerprint)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=direct_horizontal_irradiance_series.value,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return direct_horizontal_irradiance_series
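Stripped of the modelling machinery, the quantity this function returns is the clear-sky direct normal irradiance projected onto the horizontal plane, i.e. B_hc = B_0c sin(h0), zeroed wherever the surface is shaded or the Sun is below the horizon. A minimal sketch with illustrative inputs (not the PVGIS implementation):

```python
import numpy as np

# Illustrative per-timestamp inputs (assumed values, not PVGIS output)
direct_normal = np.array([0.0, 400.0, 850.0, 900.0])   # clear-sky DNI, W*m-2
solar_altitude = np.radians([-5.0, 10.0, 40.0, 60.0])  # h0, radians
surface_in_shade = np.array([False, True, False, False])

# B_hc = B_0c * sin(h0); no beam when the sun is below the horizon or shaded
direct_horizontal = direct_normal * np.sin(solar_altitude)
direct_horizontal[(solar_altitude <= 0) | surface_in_shade] = 0.0
```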
inclined ¶
This Python module is part of PVGIS' API. It implements functions to calculate the direct inclined solar irradiance.
Direct or beam irradiance is one of the main components of solar irradiance: it reaches a surface in a straight line from the Sun, without being scattered on the way.
As sunlight passes through the atmosphere, part of it is absorbed and part is scattered by air molecules and aerosols; the scattered part is defined as the diffuse irradiance, and the remaining part is the direct irradiance.
Functions:
| Name | Description |
|---|---|
calculate_direct_inclined_irradiance | Calculate the direct irradiance incident on a tilted surface [W*m-2]. |
calculate_direct_inclined_irradiance ¶
calculate_direct_inclined_irradiance(
longitude: float,
latitude: float,
elevation: float,
surface_orientation: (
SurfaceOrientation | None
) = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt | None = SURFACE_TILT_DEFAULT,
timestamps: DatetimeIndex = str(now_utc_datetimezone()),
timezone: ZoneInfo | None = None,
direct_horizontal_irradiance: ndarray | None = None,
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = True,
apply_reflectivity_factor: bool = True,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
solar_incidence_model: SolarIncidenceModel = SOLAR_INCIDENCE_ALGORITHM_DEFAULT,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = ShadingModel.pvgis,
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> DirectInclinedIrradianceFromExternalData
Calculate the direct irradiance incident on a tilted surface [W*m-2].
Calculate the direct irradiance on an inclined surface based on the solar radiation model by Hofierka, 2002. [1]_
Notes
Bic = B0c sin δexp (equation 11)
or
Bic = Bhc ⋅ sin(δexp) / sin(h0)   in W⋅m-2
(equation 12)
where :
- δexp is the solar incidence angle measured between the sun and an inclined surface defined in equation (16).
or else :
Direct Inclined = Direct Horizontal * sin( Solar Incidence ) / sin( Solar Altitude )
The implementation by Hofierka (2002) uses the solar incidence angle between the sun-vector and the plane of the reference surface (as per Jenčo, 1992). This is very important and relates to the hardcoded value True for the complementary_incidence_angle input parameter of the function. We call this angle (definition) the complementary incidence angle.
For the losses due to reflectivity, the incidence angle modifier by Martin & Ruiz (2005) expects the incidence angle between the sun-vector and the surface-normal. Hence, the respective call of the function calculate_reflectivity_factor_for_direct_irradiance_series(), expects the complement of the angle defined by Jenčo (1992). We call the incidence angle expected by the incidence angle modifier by Martin & Ruiz (2005) the typical incidence angle.
See also the documentation of the function calculate_solar_incidence_series_jenco().
References
.. [1] Hofierka, J. (2002). Some title of the paper. Journal Name, vol(issue), pages.
Source code in pvgisprototype/api/irradiance/direct/inclined.py
@log_function_call
@custom_cached
def calculate_direct_inclined_irradiance(
longitude: float,
latitude: float,
elevation: float,
surface_orientation: SurfaceOrientation | None = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt | None = SURFACE_TILT_DEFAULT,
timestamps: DatetimeIndex = str(now_utc_datetimezone()),
timezone: ZoneInfo | None = None,
# convert_longitude_360: bool = False,
direct_horizontal_irradiance: ndarray | None = None,
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = True,
# refracted_solar_zenith: (
# float | None
# ) = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT, # radians
apply_reflectivity_factor: bool = True,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
solar_incidence_model: SolarIncidenceModel = SOLAR_INCIDENCE_ALGORITHM_DEFAULT,
# complementary_incidence_angle: bool = COMPLEMENTARY_INCIDENCE_ANGLE_DEFAULT,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = ShadingModel.pvgis,
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> DirectInclinedIrradianceFromExternalData:
"""Calculate the direct irradiance incident on a tilted surface [W*m-2].
Calculate the direct irradiance on an inclined surface based on the
solar radiation model by Hofierka, 2002. [1]_
Notes
-----
Bic = B0c sin δexp (equation 11)
or
Bic = Bhc * sin(δexp) / sin(h0)   in W*m-2
(equation 12)
where :
- δexp is the solar incidence angle measured between the sun and an
inclined surface defined in equation (16).
or else :
Direct Inclined = Direct Horizontal * sin( Solar Incidence ) / sin( Solar Altitude )
The implementation by Hofierka (2002) uses the solar incidence angle
between the sun-vector and the plane of the reference surface (as per Jenčo,
1992). This is very important and relates to the hardcoded value `True` for
the `complementary_incidence_angle` input parameter of the function. We
call this angle (definition) the _complementary_ incidence angle.
For the losses due to reflectivity, the incidence angle modifier by Martin
& Ruiz (2005) expects the incidence angle between the sun-vector and the
surface-normal. Hence, the respective call of the function
`calculate_reflectivity_factor_for_direct_irradiance_series()`,
expects the complement of the angle defined by Jenčo (1992). We call the
incidence angle expected by the incidence angle modifier by Martin & Ruiz
(2005) the _typical_ incidence angle.
See also the documentation of the function
`calculate_solar_incidence_series_jenco()`.
References
----------
.. [1] Hofierka, J. (2002). Some title of the paper. Journal Name, vol(issue), pages.
"""
solar_incidence_series = model_solar_incidence_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_incidence_model=solar_incidence_model,
surface_orientation=surface_orientation,
surface_tilt=surface_tilt,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
complementary_incidence_angle=True, # = Sun-vector To Surface-plane (Jenčo, 1992) !
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
dtype=dtype,
array_backend=array_backend,
verbose=0,
log=log,
)
solar_altitude_series = model_solar_altitude_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_position_model=solar_position_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
# solar_time_model=solar_time_model,
# eccentricity_phase_offset=eccentricity_phase_offset,
# eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=0,
log=log,
)
# solar_azimuth_series = model_solar_azimuth_series(
# longitude=longitude,
# latitude=latitude,
# timestamps=timestamps,
# timezone=timezone,
# solar_position_model=solar_position_model,
# adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# # unrefracted_solar_zenith=unrefracted_solar_zenith,
# # solar_time_model=solar_time_model,
# # eccentricity_phase_offset=eccentricity_phase_offset,
# # eccentricity_amplitude=eccentricity_amplitude,
# dtype=dtype,
# array_backend=array_backend,
# verbose=0,
# log=log,
# )
surface_in_shade_series = model_surface_in_shade_series(
horizon_profile=horizon_profile,
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_time_model=solar_time_model,
solar_position_model=solar_position_model,
shading_model=shading_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
if isinstance(direct_horizontal_irradiance, ndarray):
direct_inclined_irradiance_series = (
calculate_direct_inclined_irradiance_hofierka(
timestamps=timestamps,
timezone=timezone,
direct_horizontal_irradiance=direct_horizontal_irradiance, # FixMe
apply_reflectivity_factor=apply_reflectivity_factor,
solar_incidence_series=solar_incidence_series,
solar_altitude_series=solar_altitude_series,
# solar_azimuth_series=solar_azimuth_series,
surface_in_shade_series=surface_in_shade_series,
dtype=dtype,
array_backend=array_backend,
validate_output=validate_output,
verbose=verbose,
log=log,
)
)
else:
direct_inclined_irradiance_series = (
calculate_clear_sky_direct_inclined_irradiance_hofierka(
elevation=elevation,
surface_orientation=surface_orientation,
surface_tilt=surface_tilt,
timestamps=timestamps,
timezone=timezone,
direct_horizontal_irradiance=direct_horizontal_irradiance, # FixMe
linke_turbidity_factor_series=linke_turbidity_factor_series,
apply_reflectivity_factor=apply_reflectivity_factor,
solar_incidence_series=solar_incidence_series,
solar_altitude_series=solar_altitude_series,
# solar_azimuth_series=solar_azimuth_series,
surface_in_shade_series=surface_in_shade_series,
solar_constant=solar_constant,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
validate_output=validate_output,
verbose=verbose,
log=log,
)
)
direct_inclined_irradiance_series.reflected = calculate_reflectivity_effect(
irradiance=direct_inclined_irradiance_series.value_before_reflectivity,
reflectivity_factor=direct_inclined_irradiance_series.reflectivity_factor,
)
direct_inclined_irradiance_series.reflected_percentage = calculate_reflectivity_effect_percentage(
irradiance=direct_inclined_irradiance_series.value_before_reflectivity,
reflectivity=direct_inclined_irradiance_series.reflectivity_factor,
)
# Angle output units -- Hide it if you can :-)
direct_inclined_irradiance_series.angle_output_units = angle_output_units
direct_inclined_irradiance_series.surface_orientation = (
convert_float_to_degrees_if_requested(
surface_orientation,
angle_output_units,
)
)
direct_inclined_irradiance_series.surface_tilt = (
convert_float_to_degrees_if_requested(surface_tilt, angle_output_units)
)
# direct_inclined_irradiance_series.solar_incidence = getattr(
# solar_incidence_series, angle_output_units
# )
# direct_inclined_irradiance_series.solar_azimuth = getattr(
# solar_azimuth_series, angle_output_units
# )
direct_inclined_irradiance_series.solar_altitude = getattr(
solar_altitude_series, angle_output_units
)
# Build the structured output
direct_inclined_irradiance_series.build_output(verbose, fingerprint)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=direct_inclined_irradiance_series.value,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return direct_inclined_irradiance_series
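Equation (12) above, together with the shading mask, reduces to a few array operations. A sketch using the complementary incidence angle (sun-vector to surface-plane, as per Jenčo, 1992) and illustrative inputs:

```python
import numpy as np

# Illustrative per-timestamp inputs (assumed values, not PVGIS output)
direct_horizontal = np.array([0.0, 70.0, 500.0, 750.0])  # B_hc, W*m-2
solar_altitude = np.radians([5.0, 15.0, 40.0, 60.0])     # h0
solar_incidence = np.radians([10.0, 25.0, 55.0, 70.0])   # δexp
surface_in_shade = np.array([True, False, False, False])

# B_ic = B_hc * sin(δexp) / sin(h0)  (Hofierka, 2002, equation 12)
direct_inclined = direct_horizontal * np.sin(solar_incidence) / np.sin(solar_altitude)
direct_inclined[surface_in_shade] = 0.0
```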
normal ¶
This Python module is part of PVGIS' API. It implements functions to calculate the direct solar irradiance.
Direct or beam irradiance is one of the main components of solar irradiance: it reaches a surface in a straight line from the Sun, without being scattered on the way.
As sunlight passes through the atmosphere, part of it is absorbed and part is scattered by air molecules and aerosols; the scattered part is defined as the diffuse irradiance, and the remaining part is the direct irradiance.
Functions:
| Name | Description |
|---|---|
calculate_direct_normal_irradiance_series | Calculate the direct normal irradiance. |
calculate_direct_normal_irradiance_series ¶
calculate_direct_normal_irradiance_series(
timestamps: DatetimeIndex | None,
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
optical_air_mass_series: OpticalAirMass = [
OPTICAL_AIR_MASS_TIME_SERIES_DEFAULT
],
clip_to_physically_possible_limits: bool = True,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> DirectNormalIrradiance
Calculate the direct normal irradiance.
The direct normal irradiance represents the amount of solar radiation received per unit area by a surface that is perpendicular (normal) to the rays that come in a straight line from the direction of the sun at its current position in the sky.
This function implements the algorithm described by Hofierka, 2002. [1]_
Notes
Known also as : SID, units : W*m-2
References
.. [1] Hofierka, J. (2002). Some title of the paper. Journal Name, vol(issue), pages.
Source code in pvgisprototype/api/irradiance/direct/normal.py
@log_function_call
@custom_cached
def calculate_direct_normal_irradiance_series(
timestamps: DatetimeIndex | None,
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
optical_air_mass_series: OpticalAirMass = [
OPTICAL_AIR_MASS_TIME_SERIES_DEFAULT
], # REVIEW-ME + ?
clip_to_physically_possible_limits: bool = True,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> DirectNormalIrradiance:
"""Calculate the direct normal irradiance.
The direct normal irradiance represents the amount of solar radiation
received per unit area by a surface that is perpendicular (normal) to the
rays that come in a straight line from the direction of the sun at its
current position in the sky.
This function implements the algorithm described by Hofierka, 2002. [1]_
Notes
-----
Known also as : SID, units : W*m-2
References
----------
.. [1] Hofierka, J. (2002). Some title of the paper. Journal Name, vol(issue), pages.
"""
direct_normal_irradiance_series = calculate_direct_normal_irradiance_hofierka(
timestamps=timestamps,
linke_turbidity_factor_series=linke_turbidity_factor_series,
optical_air_mass_series=optical_air_mass_series,
clip_to_physically_possible_limits=clip_to_physically_possible_limits,
solar_constant=solar_constant,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
direct_normal_irradiance_series.build_output(verbose, fingerprint)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=direct_normal_irradiance_series.value, # on the array. Or do on the object ?
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return direct_normal_irradiance_series
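At its core, the clear-sky direct normal irradiance in this model attenuates the extraterrestrial irradiance G0 exponentially with the Linke turbidity, the relative optical air mass, and the Rayleigh optical thickness. A self-contained sketch of that relation (the constants and inputs are illustrative assumptions, not PVGIS defaults):

```python
import numpy as np

SOLAR_CONSTANT = 1367.0  # W*m-2 (assumed value)

def rayleigh_optical_thickness(air_mass):
    """Rayleigh optical thickness δR(m), piecewise as used by Hofierka, 2002."""
    m = np.asarray(air_mass, dtype=float)
    low = 1.0 / (6.6296 + 1.7513 * m - 0.1202 * m**2
                 + 0.0065 * m**3 - 0.00013 * m**4)
    high = 1.0 / (10.4 + 0.718 * m)
    return np.where(m <= 20, low, high)

def clear_sky_dni(air_mass, linke_turbidity, eccentricity_correction=1.0):
    """Sketch of B_0c = G_0 * exp(-0.8662 * T_LK * m * δR(m))."""
    extraterrestrial = SOLAR_CONSTANT * eccentricity_correction  # G_0
    return extraterrestrial * np.exp(
        -0.8662 * linke_turbidity * air_mass * rayleigh_optical_thickness(air_mass)
    )

dni_clear = clear_sky_dni(air_mass=1.0, linke_turbidity=3.0)
```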
normal_from_horizontal ¶
This Python module is part of PVGIS' API. It implements functions to calculate the direct normal solar irradiance.
Direct or beam irradiance is one of the main components of solar irradiance: it reaches a surface in a straight line from the Sun, without being scattered on the way.
As sunlight passes through the atmosphere, part of it is absorbed and part is scattered by air molecules and aerosols; the scattered part is defined as the diffuse irradiance, and the remaining part is the direct irradiance.
Functions:
| Name | Description |
|---|---|
calculate_direct_normal_from_horizontal_irradiance_series | Calculate the direct normal from the horizontal irradiance. |
calculate_direct_normal_from_horizontal_irradiance_series ¶
calculate_direct_normal_from_horizontal_irradiance_series(
direct_horizontal_irradiance: ndarray,
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex | None = None,
timezone: ZoneInfo | None = None,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> DirectNormalFromHorizontalIrradiance
Calculate the direct normal from the horizontal irradiance.
The direct normal irradiance represents the amount of solar radiation received per unit area by a surface that is perpendicular (normal) to the rays that come in a straight line from the direction of the sun at its current position in the sky.
This function calculates the normal irradiance from the given horizontal irradiance component.
Notes
Known also as : SID, units : W*m-2
References
.. [1] Hofierka, J. (2002). Some title of the paper. Journal Name, vol(issue), pages.
Source code in pvgisprototype/api/irradiance/direct/normal_from_horizontal.py
@log_function_call
@custom_cached
def calculate_direct_normal_from_horizontal_irradiance_series(
direct_horizontal_irradiance: ndarray,
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex | None = None,
timezone: ZoneInfo | None = None,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> DirectNormalFromHorizontalIrradiance:
"""Calculate the direct normal from the horizontal irradiance.
The direct normal irradiance represents the amount of solar radiation
received per unit area by a surface that is perpendicular (normal) to the
rays that come in a straight line from the direction of the sun at its
current position in the sky.
This function calculates the normal irradiance from the given horizontal
irradiance component.
Notes
-----
Known also as : SID, units : W*m-2
References
----------
.. [1] Hofierka, J. (2002). Some title of the paper. Journal Name, vol(issue), pages.
"""
# FixMe : somehow let the angle_output_units requested by the user work !
solar_altitude_series = model_solar_altitude_series(
longitude=convert_float_to_radians_if_requested(longitude, RADIANS),
latitude=convert_float_to_radians_if_requested(latitude, RADIANS),
timestamps=timestamps,
timezone=timezone,
solar_position_model=solar_position_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
)
direct_normal_irradiance_series = calculate_direct_normal_from_horizontal_irradiance_hofierka(
direct_horizontal_irradiance=direct_horizontal_irradiance,
solar_altitude_series=solar_altitude_series,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
fingerprint=fingerprint,
)
direct_normal_irradiance_series.build_output(verbose, fingerprint)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=direct_normal_irradiance_series,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return direct_normal_irradiance_series
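The conversion here is the inverse of the horizontal projection: divide the horizontal beam by sin(h0), guarding against the blow-up at very low solar altitudes. A sketch (the 1-degree cutoff is an assumption for illustration, not the PVGIS value):

```python
import numpy as np

# Illustrative per-timestamp inputs (assumed values, not PVGIS output)
direct_horizontal = np.array([0.0, 50.0, 500.0, 750.0])  # B_hc, W*m-2
solar_altitude = np.radians([0.5, 5.0, 40.0, 60.0])      # h0, radians

# B_0c = B_hc / sin(h0); suppress the singularity near the horizon
sin_altitude = np.sin(solar_altitude)
direct_normal = np.where(
    sin_altitude > np.sin(np.radians(1.0)),  # assumed low-altitude cutoff
    direct_horizontal / sin_altitude,
    0.0,
)
```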
optical_air_mass ¶
Functions:
| Name | Description |
|---|---|
adjust_elevation | Modifier component for the solar altitude as per Hofierka, 2002 |
calculate_optical_air_mass_series | Approximate the relative optical air mass. |
adjust_elevation ¶
adjust_elevation(
elevation: float,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
)
Modifier component for the solar altitude as per Hofierka, 2002
This function implements a modifier component for the solar altitude at the given elevation, as described by Hofierka, 2002 [1]_
Notes
In PVGIS C source code:
elevationCorr = exp(-sunVarGeom->z_orig / 8434.5);
References
.. [1] Hofierka, J. (2002). Some title of the paper. Journal Name, vol(issue), pages.
Source code in pvgisprototype/api/irradiance/direct/optical_air_mass.py
@log_function_call
@custom_cached
@validate_with_pydantic(AdjustElevationInputModel)
def adjust_elevation(
elevation: float,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
):
"""Modifier component for the solar altitude as per Hofierka, 2002
This function implements a modifier component for the solar altitude at
the given elevation, as described by Hofierka, 2002 [1]_
Notes
-----
In PVGIS C source code:
elevationCorr = exp(-sunVarGeom->z_orig / 8434.5);
References
----------
.. [1] Hofierka, J. (2002). Some title of the paper. Journal Name, vol(issue), pages.
"""
adjusted_elevation = np.array(np.exp(-elevation.value / 8434.5), dtype=dtype)
log_data_fingerprint(
data=adjusted_elevation,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return Elevation(value=adjusted_elevation, unit="meters")
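The correction is a single exponential with an 8434.5 m scale height, approximating the pressure ratio p/p0. A sketch:

```python
import numpy as np

def elevation_correction(elevation_in_metres):
    """Pressure-ratio proxy p/p0 = exp(-z / 8434.5), as in the PVGIS C code."""
    return np.exp(-np.asarray(elevation_in_metres, dtype=float) / 8434.5)

at_sea_level = elevation_correction(0.0)       # exactly 1.0
at_scale_height = elevation_correction(8434.5)  # exp(-1)
```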
calculate_optical_air_mass_series ¶
calculate_optical_air_mass_series(
elevation: float,
refracted_solar_altitude_series: RefractedSolarAltitude,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
) -> OpticalAirMass
Approximate the relative optical air mass.
Approximate the relative optical air mass for a time series.
This function implements the algorithm described by Minzer et al. [1]_ and Hofierka [2]_ (equation 5) in which the relative optical air mass (unitless) is defined as follows :
m = (p/p0) / (sin h0_ref + 0.50572 (h0_ref + 6.07995)^(- 1.6364))
where :
- h0_ref is the solar altitude h0, in degrees, corrected for the
atmospheric refraction component ∆h0_ref.
References
.. [1] Minzer, A., Champion, K. S. W., & Pond, H. L. (1959). The ARDC Model Atmosphere. Air Force Surveys in Geophysics, 115. AFCRL.
.. [2] Hofierka, 2002
Source code in pvgisprototype/api/irradiance/direct/optical_air_mass.py
@log_function_call
@custom_cached
@validate_with_pydantic(CalculateOpticalAirMassTimeSeriesInputModel)
def calculate_optical_air_mass_series(
elevation: float,
refracted_solar_altitude_series: RefractedSolarAltitude,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
) -> OpticalAirMass:
"""Approximate the relative optical air mass.
Approximate the relative optical air mass for a time series.
This function implements the algorithm described by Minzer et al. [1]_
and Hofierka [2]_ (equation 5) in which the relative optical air mass
(unitless) is defined as follows :
m = (p/p0) / (sin h0_ref + 0.50572 (h0_ref + 6.07995)^(- 1.6364))
where :
- h0_ref is the solar altitude h0, in degrees, corrected for the
atmospheric refraction component ∆h0_ref.
References
----------
.. [1] Minzer, A., Champion, K. S. W., & Pond, H. L. (1959).
The ARDC Model Atmosphere. Air Force Surveys in Geophysics, 115. AFCRL.
.. [2] Hofierka, 2002
"""
adjusted_elevation = adjust_elevation(elevation.value)
degrees_plus_offset = refracted_solar_altitude_series.degrees + 6.07995
# hack to avoid warning/s of invalid or zero values subjected to np.power()
# clip degrees_plus_offset values to a small positive minimum
epsilon = 1e-10
safe_degrees_plus_offset = np.clip(degrees_plus_offset, a_min=epsilon, a_max=None)
power_values = np.power(safe_degrees_plus_offset, -1.6364)
optical_air_mass_series = adjusted_elevation.value / (
np.sin(refracted_solar_altitude_series.radians) # in radians for sin()
+ 0.50572 * power_values
)
log_data_fingerprint(
data=optical_air_mass_series,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return OpticalAirMass(
value=optical_air_mass_series,
unit=OPTICAL_AIR_MASS_UNIT,
)
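Equation (5) can be sketched directly. Note the mixed units: the sine takes the refracted altitude in radians, while the power term takes it in degrees, and the altitude-plus-offset term is clipped to a small positive minimum to avoid invalid powers (as in the source above):

```python
import numpy as np

def relative_optical_air_mass(refracted_altitude_degrees, pressure_ratio=1.0):
    """Sketch of m = (p/p0) / (sin h0_ref + 0.50572 (h0_ref + 6.07995)^-1.6364)."""
    h0 = np.asarray(refracted_altitude_degrees, dtype=float)
    # the power term uses degrees; sin() needs radians
    denominator = np.sin(np.radians(h0)) + 0.50572 * np.power(
        np.clip(h0 + 6.07995, 1e-10, None), -1.6364
    )
    return pressure_ratio / denominator

overhead = relative_optical_air_mass(90.0)  # close to one air mass
horizon = relative_optical_air_mass(0.0)    # several tens of air masses
```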
rayleigh_optical_thickness ¶
This Python module is part of PVGIS' API. It implements functions to calculate the Rayleigh optical thickness.
Functions:
| Name | Description |
|---|---|
calculate_rayleigh_optical_thickness_series | Calculate the Rayleigh optical thickness. |
calculate_rayleigh_optical_thickness_series ¶
calculate_rayleigh_optical_thickness_series(
optical_air_mass_series: OpticalAirMass,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
) -> RayleighThickness
Calculate the Rayleigh optical thickness.
Calculate Rayleigh optical thickness for a time series.
Source code in pvgisprototype/api/irradiance/direct/rayleigh_optical_thickness.py
@log_function_call
@custom_cached
def calculate_rayleigh_optical_thickness_series(
optical_air_mass_series: OpticalAirMass, # OPTICAL_AIR_MASS_TIME_SERIES_DEFAULT
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
) -> RayleighThickness:
"""Calculate the Rayleigh optical thickness.
Calculate Rayleigh optical thickness for a time series.
"""
rayleigh_thickness_series = np.zeros_like(
optical_air_mass_series.value, dtype=dtype
)
smaller_than_20 = optical_air_mass_series.value <= 20
larger_than_20 = optical_air_mass_series.value > 20
rayleigh_thickness_series[smaller_than_20] = 1 / (
6.6296
+ 1.7513 * optical_air_mass_series.value[smaller_than_20]
- 0.1202 * np.power(optical_air_mass_series.value[smaller_than_20], 2)
+ 0.0065 * np.power(optical_air_mass_series.value[smaller_than_20], 3)
- 0.00013 * np.power(optical_air_mass_series.value[smaller_than_20], 4)
)
rayleigh_thickness_series[larger_than_20] = 1 / (
10.4 + 0.718 * optical_air_mass_series.value[larger_than_20]
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=rayleigh_thickness_series,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return RayleighThickness(
value=rayleigh_thickness_series,
unit=RAYLEIGH_OPTICAL_THICKNESS_UNIT,
)
refraction ¶
This Python module is part of PVGIS' API. It implements functions to calculate the refracted solar altitude.
Functions:
| Name | Description |
|---|---|
calculate_refracted_solar_altitude_series | Adjust the solar altitude angle for atmospheric refraction. |
calculate_refracted_solar_altitude_series ¶
calculate_refracted_solar_altitude_series(
solar_altitude_series: SolarAltitude,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = 0,
log: int = 0,
) -> RefractedSolarAltitude
Adjust the solar altitude angle for atmospheric refraction.
Adjust the solar altitude angle for atmospheric refraction for a time series.
Notes
This function:
- requires solar altitude values in degrees.
- is expected to return output of the same dtype as the input solar_altitude_series array.
Source code in pvgisprototype/api/irradiance/direct/refraction.py
@log_function_call
@custom_cached
def calculate_refracted_solar_altitude_series(
solar_altitude_series: SolarAltitude,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = 0,
log: int = 0,
) -> RefractedSolarAltitude:
"""Adjust the solar altitude angle for atmospheric refraction.
Adjust the solar altitude angle for atmospheric refraction for a time
series.
Notes
-----
This function:
- requires solar altitude values in degrees.
- is expected to return output of the same `dtype` as the input
`solar_altitude_series` array.
"""
atmospheric_refraction = (
0.061359
* (
0.1594
+ 1.123 * solar_altitude_series.degrees
+ 0.065656 * np.power(solar_altitude_series.degrees, 2)
)
/ (
1
+ 28.9344 * solar_altitude_series.degrees
+ 277.3971 * np.power(solar_altitude_series.degrees, 2)
)
)
refracted_solar_altitude_series = (
solar_altitude_series.degrees + atmospheric_refraction
)
# The refracted solar altitude
# is used to calculate the optical air mass as per Kasten, 1989
# In the "Revised optical air mass tables", the solar altitude denoted by
# 'γ' ranges from 0 to 90 degrees.
# refracted_solar_altitude_series = np.clip(
# refracted_solar_altitude_series, 0, 90
# )
log_data_fingerprint(
data=refracted_solar_altitude_series,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return RefractedSolarAltitude(
value=refracted_solar_altitude_series, # ensure output is of input dtype !
unit=DEGREES,
)
effective ¶
Functions:
| Name | Description |
|---|---|
calculate_spectrally_corrected_effective_irradiance | Calculate the effective irradiance after the spectral effect |
calculate_spectrally_corrected_effective_irradiance ¶
calculate_spectrally_corrected_effective_irradiance(
irradiance_series: InclinedIrradiance,
spectral_factor_series: SpectralFactorSeries = SpectralFactorSeries(
value=SPECTRAL_FACTOR_DEFAULT
),
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> EffectiveIrradiance
Calculate the effective irradiance after applying the spectral effect
Calculate the effective irradiance by applying the spectral factor(s) to the inclined global irradiance, before any changes due to the reflectivity effect.
Source code in pvgisprototype/api/irradiance/effective.py
@log_function_call
def calculate_spectrally_corrected_effective_irradiance(
irradiance_series: InclinedIrradiance,
spectral_factor_series: SpectralFactorSeries = SpectralFactorSeries(
value=SPECTRAL_FACTOR_DEFAULT
),
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> EffectiveIrradiance:
"""Calculate the effective irradiance after the spectral effect
Calculate the effective irradiance by applying the spectral effect factor/s
on the inclined global irradiance and before change/s due to the
reflectivity effect.
"""
# A stub for the effective irradiance series used in the output dictionary
# array_parameters = {
# "shape": irradiance_series.shape,
# "dtype": dtype,
# "init_method": "empty",
# "backend": array_backend,
# }
# effective_irradiance_series = create_array(**array_parameters)
# The following is programmatically more "expensive" in order to
# re-use the `irradiance_series` to avoid a possibly unbound variable !
effective_irradiance_series = irradiance_series * spectral_factor_series.value
spectral_effect_series = irradiance_series - (
irradiance_series / spectral_factor_series.value
)
# --------------------------------------------------- Is this safe ? -
with np.errstate(divide="ignore", invalid="ignore"):
spectral_effect_percentage_series = 100 * where(
irradiance_series != 0,
(effective_irradiance_series - irradiance_series) / irradiance_series,
0,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=effective_irradiance_series,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return EffectiveIrradiance(
value=effective_irradiance_series,
spectral_factor=spectral_factor_series,
spectral_effect=spectral_effect_series,
spectral_effect_percentage=spectral_effect_percentage_series,
spectral_factor_algorithm="",
)
extraterrestrial ¶
Modules:
| Name | Description |
|---|---|
horizontal | |
normal | |
horizontal ¶
Functions:
| Name | Description |
|---|---|
calculate_extraterrestrial_horizontal_irradiance | Calculate the horizontal extraterrestrial irradiance over a period of time |
calculate_extraterrestrial_horizontal_irradiance ¶
calculate_extraterrestrial_horizontal_irradiance(
longitude: float,
latitude: float,
extraterrestrial_normal_irradiance: (
ExtraterrestrialNormalIrradiance | None
) = ExtraterrestrialNormalIrradiance(),
timestamps: DatetimeIndex | None = DatetimeIndex(
[now(tz="UTC")]
),
timezone: ZoneInfo | None = None,
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
fingerprint: bool = False,
) -> ndarray | dict
Calculate the horizontal extraterrestrial irradiance over a period of time
Notes
Symbol in ... G0
Source code in pvgisprototype/api/irradiance/extraterrestrial/horizontal.py
@log_function_call
@custom_cached
def calculate_extraterrestrial_horizontal_irradiance(
longitude: float,
latitude: float,
extraterrestrial_normal_irradiance: ExtraterrestrialNormalIrradiance | None = ExtraterrestrialNormalIrradiance(),
timestamps: DatetimeIndex | None = DatetimeIndex([Timestamp.now(tz='UTC')]),
timezone: ZoneInfo | None = None,
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
# refracted_solar_zenith: (
# float | None
# ) = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT, # radians
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
# solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
fingerprint: bool = False,
) -> ndarray | dict:
"""
Calculate the horizontal extraterrestrial irradiance over a period of time
Notes
-----
Symbol in ... `G0`
"""
# build reusable parameter dictionaries
earth_orbit = {
'eccentricity_phase_offset': eccentricity_phase_offset,
'eccentricity_amplitude': eccentricity_amplitude,
}
array_parameters = {
"dtype": dtype,
"array_backend": array_backend,
}
output_parameters = {
'verbose': verbose, # Is this wanted here ? i.e. not setting = 0 ?
'log': log,
}
if extraterrestrial_normal_irradiance is None:
extraterrestrial_normal_irradiance = (
calculate_extraterrestrial_normal_irradiance_hofierka(
timestamps=timestamps,
solar_constant=solar_constant,
**earth_orbit,
**array_parameters,
**output_parameters,
)
)
solar_altitude_series = model_solar_altitude_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_position_model=solar_position_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
# solar_time_model=solar_time_model,
**earth_orbit,
**array_parameters,
**output_parameters,
)
extraterrestrial_horizontal_irradiance = (
calculate_extraterrestrial_horizontal_irradiance_series_hofierka(
extraterrestrial_normal_irradiance=extraterrestrial_normal_irradiance,
solar_altitude_series=solar_altitude_series,
)
)
extraterrestrial_horizontal_irradiance.build_output(verbose, fingerprint)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=extraterrestrial_horizontal_irradiance.value,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return extraterrestrial_horizontal_irradiance
normal ¶
Functions:
| Name | Description |
|---|---|
calculate_extraterrestrial_normal_irradiance_series | Calculate the normal extraterrestrial irradiance over a period of time |
calculate_extraterrestrial_normal_irradiance_series ¶
calculate_extraterrestrial_normal_irradiance_series(
timestamps: DatetimeIndex | None,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
fingerprint: bool = False,
) -> ndarray | dict
Calculate the normal extraterrestrial irradiance over a period of time
Notes
Symbol in ... G0
Source code in pvgisprototype/api/irradiance/extraterrestrial/normal.py
@log_function_call
@custom_cached
def calculate_extraterrestrial_normal_irradiance_series(
timestamps: DatetimeIndex | None,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
fingerprint: bool = False,
) -> ndarray | dict:
"""
Calculate the normal extraterrestrial irradiance over a period of time
Notes
-----
Symbol in ... `G0`
"""
extraterrestrial_normal_irradiance_series = (
calculate_extraterrestrial_normal_irradiance_hofierka(
timestamps=timestamps,
solar_constant=solar_constant,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
)
extraterrestrial_normal_irradiance_series.build_output(verbose, fingerprint)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=extraterrestrial_normal_irradiance_series.value,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return extraterrestrial_normal_irradiance_series
limits ¶
Functions:
| Name | Description |
|---|---|
calculate_limits | Calculate physically possible and extremely rare limits |
calculate_limits ¶
calculate_limits(
solar_zenith: float,
air_temperature: float,
limits_dictionary: dict,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = VERBOSE_LEVEL_DEFAULT,
)
Calculate physically possible and extremely rare limits
Sum SW = [Diffuse SW + (Direct Normal SW) x µ0]
Global SWdn: SW measured by unshaded pyranometer
Diffuse SW: SW measured by shaded pyranometer
Direct Normal SW: direct normal component of SW
Direct SW: direct normal SW times the cosine of SZA; [(Direct Normal SW) x µ0]
LWdn: downwelling LW measured by a pyrgeometer
LWup: upwelling LW measured by a pyrgeometer
Notes
BSRN Global Network recommended QC tests, V2.0, C. N. Long and E. G. Dutton See : https://bsrn.awi.de/fileadmin/user_upload/bsrn.awi.de/Publications/BSRN_recommended_QC_tests_V2.pdf
Source code in pvgisprototype/api/irradiance/limits.py
@log_function_call
def calculate_limits(
solar_zenith: float,
air_temperature: float,
limits_dictionary: dict,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = VERBOSE_LEVEL_DEFAULT,
):
"""Calculate physically possible and extremely rare limits
Sum SW = [Diffuse SW + (Direct Normal SW) X µ0]
Global SWdn: SW measured by unshaded pyranometer
Diffuse SW: SW measured by shaded pyranometer
Direct Normal SW: direct normal component of SW
Direct SW: direct normal SW times the cosine of SZA; [(Direct Normal SW) x µ0]
LWdn: downwelling LW measured by a pyrgeometer
LWup: upwelling LW measured by a pyrgeometer
Notes
-----
BSRN Global Network recommended QC tests, V2.0, C. N. Long and E. G. Dutton
See : https://bsrn.awi.de/fileadmin/user_upload/bsrn.awi.de/Publications/BSRN_recommended_QC_tests_V2.pdf
"""
if not (170 < air_temperature < 350):
raise ValueError("Air temperature must range in [170, 350] K")
mu0 = np.array(np.cos(np.radians(solar_zenith)))
mu0[solar_zenith > 90] = 0.0 # Set to 0 if solar_zenith > 90 degrees
earth_sun_distance = SOLAR_CONSTANT / (AU**2)
calculated_limits = {}
for key, value in limits_dictionary.items():
calculated_limits[key] = {"Min": value["Min"]}
if callable(value["Max"]):
calculated_limits[key]["Max"] = value["Max"](earth_sun_distance, mu0)
else:
calculated_limits[key]["Max"] = value["Max"]
log_data_fingerprint(
data=calculated_limits,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return calculated_limits
reflectivity ¶
Functions:
| Name | Description |
|---|---|
apply_reflectivity_factor_for_nondirect_irradiance | Apply the reflectivity effect on the input non-direct irradiance |
apply_reflectivity_factor_for_nondirect_irradiance ¶
apply_reflectivity_factor_for_nondirect_irradiance(
ground_reflected_inclined_irradiance_series: DiffuseGroundReflectedInclinedIrradiance,
surface_tilt: float = SURFACE_TILT_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
) -> DiffuseGroundReflectedInclinedIrradiance
Apply the reflectivity effect on the input non-direct irradiance
The total loss due to the reflectivity effect (which depends on the solar incidence angle) is computed as the difference between the irradiance after and before the correction.
Notes
See relevant function/s under algorithms/martin_ruiz.
Source code in pvgisprototype/api/irradiance/reflectivity.py
@log_function_call
def apply_reflectivity_factor_for_nondirect_irradiance(
ground_reflected_inclined_irradiance_series: DiffuseGroundReflectedInclinedIrradiance,
surface_tilt: float = SURFACE_TILT_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
) -> DiffuseGroundReflectedInclinedIrradiance:
"""Apply the reflectivity effect on the input non-direct irradiance
The total loss due to the reflectivity effect (which depends on the solar
incidence angle) is computed as the difference between the irradiance after
and before the correction.
Notes
-----
See relevant function/s under `algorithms/martin_ruiz`.
"""
logger.info(
"Applying reflectivity loss!",
alt="[orange][code]Applying reflectivity loss![/code][/orange]"
)
# A single reflectivity coefficient
ground_reflected_irradiance_reflectivity_coefficient = sin(surface_tilt) + (
surface_tilt - sin(surface_tilt)
) / (1 - cos(surface_tilt))
# The reflectivity factor
ground_reflected_inclined_irradiance_reflectivity_factor = calculate_reflectivity_factor_for_nondirect_irradiance(
indirect_angular_loss_coefficient=ground_reflected_irradiance_reflectivity_coefficient,
)
# Following is data-model specific, consult the corresponding YAML file !
# Generate a time series
ground_reflected_inclined_irradiance_series.reflectivity_factor = create_array(
ground_reflected_inclined_irradiance_series.value.shape,
dtype=dtype,
init_method=ground_reflected_inclined_irradiance_reflectivity_factor,
backend=array_backend,
)
# Apply the reflectivity time series
ground_reflected_inclined_irradiance_series.value *= (
ground_reflected_inclined_irradiance_series.reflectivity_factor
)
# What is the unmodified quantity ?
ground_reflected_inclined_irradiance_series.value_before_reflectivity = where(
ground_reflected_inclined_irradiance_series.reflectivity_factor != 0,
ground_reflected_inclined_irradiance_series.value
/ ground_reflected_inclined_irradiance_series.reflectivity_factor,
0,
)
# The net effect
ground_reflected_inclined_irradiance_series.reflected = calculate_reflectivity_effect(
irradiance=ground_reflected_inclined_irradiance_series.value_before_reflectivity,
reflectivity_factor=ground_reflected_inclined_irradiance_series.reflectivity_factor,
)
# Percentage of the net effect
ground_reflected_inclined_irradiance_series.reflected_percentage = calculate_reflectivity_effect_percentage(
irradiance=ground_reflected_inclined_irradiance_series.value_before_reflectivity,
reflectivity=ground_reflected_inclined_irradiance_series.reflectivity_factor,
)
return ground_reflected_inclined_irradiance_series
shortwave ¶
Modules:
| Name | Description |
|---|---|
horizontal | API module to calculate the global (shortwave) irradiance over a |
inclined | API module to calculate the global (shortwave) irradiance over a |
horizontal ¶
API module to calculate the global (shortwave) irradiance over a location for a period in time.
Functions:
| Name | Description |
|---|---|
calculate_global_horizontal_irradiance_series | Calculate the clear-sky global horizontal irradiance (GHI) |
calculate_global_horizontal_irradiance_series ¶
calculate_global_horizontal_irradiance_series(
longitude: float,
latitude: float,
elevation: float,
timestamps: datetime | None = None,
timezone: ZoneInfo | None = None,
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = True,
solar_position_model: SolarPositionModel = noaa,
solar_time_model: SolarTimeModel = noaa,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = pvgis,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
)
Calculate the clear-sky global horizontal irradiance (GHI)
The global horizontal irradiance represents the total amount of shortwave radiation received from above by a surface horizontal to the ground. It includes both the direct and the diffuse solar radiation.
Source code in pvgisprototype/api/irradiance/shortwave/horizontal.py
@log_function_call
def calculate_global_horizontal_irradiance_series(
longitude: float,
latitude: float,
elevation: float,
timestamps: datetime | None = None,
timezone: ZoneInfo | None = None,
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = True,
# unrefracted_solar_zenith: UnrefractedSolarZenith | None = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT, # radians
solar_position_model: SolarPositionModel = SolarPositionModel.noaa,
solar_time_model: SolarTimeModel = SolarTimeModel.noaa,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = ShadingModel.pvgis,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
):
"""
Calculate the clear-sky global horizontal irradiance (GHI)
The global horizontal irradiance represents the total amount of shortwave
radiation received from above by a surface horizontal to the ground. It
includes both the direct and the diffuse solar radiation.
"""
if verbose > 0:
logger.debug(
":information: Modelling direct horizontal irradiance...",
alt=":information: [bold][magenta]Modelling[/magenta] direct horizontal irradiance[/bold]...",
)
direct_horizontal_irradiance_series = calculate_clear_sky_direct_horizontal_irradiance_series(
longitude=longitude, # required by some of the solar time algorithms
latitude=latitude,
elevation=elevation,
timestamps=timestamps,
timezone=timezone,
solar_time_model=solar_time_model,
solar_position_model=solar_position_model,
linke_turbidity_factor_series=linke_turbidity_factor_series,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
solar_constant=solar_constant,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
horizon_profile=horizon_profile,
shading_model=shading_model,
# angle_output_units=angle_output_units,
dtype=dtype,
array_backend=array_backend,
validate_output=validate_output,
verbose=verbose,
log=log,
fingerprint=fingerprint,
)
diffuse_horizontal_irradiance_series = (
calculate_clear_sky_diffuse_horizontal_irradiance(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
linke_turbidity_factor_series=linke_turbidity_factor_series,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
solar_position_model=solar_position_model,
solar_time_model=solar_time_model,
solar_constant=solar_constant,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
# angle_output_units=angle_output_units,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
fingerprint=fingerprint,
)
)
global_horizontal_irradiance_series = (
direct_horizontal_irradiance_series.value
+ diffuse_horizontal_irradiance_series.value
)
out_of_range, out_of_range_index = identify_values_out_of_range(
series=global_horizontal_irradiance_series,
shape=timestamps.shape,
data_model=GlobalHorizontalIrradiance(),
)
global_horizontal_irradiance_series = GlobalHorizontalIrradiance(
value=global_horizontal_irradiance_series,
out_of_range=out_of_range,
out_of_range_index=out_of_range_index,
direct_horizontal_irradiance=direct_horizontal_irradiance_series,
extraterrestrial_normal_irradiance=diffuse_horizontal_irradiance_series.extraterrestrial_normal_irradiance,
linke_turbidity_factor=linke_turbidity_factor_series,
# value_before_reflectivity=diffuse_inclined_irradiance_before_reflectivity_series if diffuse_inclined_irradiance_before_reflectivity_series is not None else NOT_AVAILABLE,
# reflectivity_factor= diffuse_irradiance_reflectivity_factor_series if diffuse_inclined_irradiance_before_reflectivity_series is not None else NOT_AVAILABLE,
# surface_in_shade=direct_horizontal_irradiance_series.surface_in_shade, ### We Need This !
# shading_states=shading_states,
# shading_state=shading_state_series,
diffuse_horizontal_irradiance=diffuse_horizontal_irradiance_series,
# diffuse_sky_irradiance=diffuse_sky_irradiance_series,
# term_n=n_series,
# kb_ratio=kb_series,
solar_positioning_algorithm=direct_horizontal_irradiance_series.solar_altitude.solar_positioning_algorithm,
solar_timing_algorithm=direct_horizontal_irradiance_series.solar_altitude.solar_timing_algorithm,
# shading_algorithm=shading_model,
elevation=elevation,
# surface_orientation=surface_orientation,
# surface_tilt=surface_tilt,
solar_altitude=direct_horizontal_irradiance_series.solar_altitude,
# solar_azimuth=direct_horizontal_irradiance_series.solar_azimuth,
# azimuth_difference=azimuth_difference_series,
adjusted_for_atmospheric_refraction=direct_horizontal_irradiance_series.solar_altitude.adjusted_for_atmospheric_refraction,
# solar_incidence=solar_incidence_series,
# solar_incidence_model=solar_incidence_series.incidence_algorithm,
# solar_incidence_definition=solar_incidence_series.definition,
solar_radiation_model=HOFIERKA_2002,
data_source=HOFIERKA_2002,
)
# ==========================================================================
# Following do not affect calculations, yet important are they for the output !
# Perhaps find a way to "hide" them ?
global_horizontal_irradiance_series.angle_output_units = angle_output_units
# global_horizontal_irradiance_series.solar_altitude = getattr(
# solar_altitude_series, angle_output_units
# )
# global_horizontal_irradiance_series.adjusted_for_atmospheric_refraction = (
# solar_altitude_series.adjusted_for_atmospheric_refraction
# )
# ==========================================================================
global_horizontal_irradiance_series.build_output(verbose, fingerprint)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=global_horizontal_irradiance_series,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return global_horizontal_irradiance_series
inclined ¶
API module to calculate the global (shortwave) irradiance over a location for a period in time.
Functions:
| Name | Description |
|---|---|
calculate_global_inclined_irradiance | Calculate the global irradiance on an inclined surface |
calculate_global_inclined_irradiance ¶
calculate_global_inclined_irradiance(
longitude: float,
latitude: float,
elevation: float,
surface_orientation: SurfaceOrientation = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt = SURFACE_TILT_DEFAULT,
surface_tilt_horizontally_flat_panel_threshold: float = SURFACE_TILT_HORIZONTALLY_FLAT_PANEL_THRESHOLD,
timestamps: DatetimeIndex = DatetimeIndex(
[now(tz="UTC")]
),
timezone: ZoneInfo | None = ZoneInfo("UTC"),
global_horizontal_irradiance: ndarray | None = None,
direct_horizontal_irradiance: ndarray | None = None,
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = True,
albedo: float | None = ALBEDO_DEFAULT,
apply_reflectivity_factor: bool = ANGULAR_LOSS_FACTOR_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = noaa,
sun_horizon_position: List[
SunHorizonPositionModel
] = SUN_HORIZON_POSITION_DEFAULT,
solar_incidence_model: SolarIncidenceModel = jenco,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = pvgis,
shading_states: List[ShadingState] = [all],
solar_time_model: SolarTimeModel = noaa,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> GlobalInclinedIrradiance
Source code in pvgisprototype/api/irradiance/shortwave/inclined.py
@log_function_call
def calculate_global_inclined_irradiance(
longitude: float,
latitude: float,
elevation: float,
surface_orientation: SurfaceOrientation = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt = SURFACE_TILT_DEFAULT,
surface_tilt_horizontally_flat_panel_threshold: float = SURFACE_TILT_HORIZONTALLY_FLAT_PANEL_THRESHOLD,
timestamps: DatetimeIndex = DatetimeIndex([Timestamp.now(tz='UTC')]),
timezone: ZoneInfo | None = ZoneInfo("UTC"),
global_horizontal_irradiance: ndarray | None = None,
direct_horizontal_irradiance: ndarray | None = None,
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = True,
# unrefracted_solar_zenith: UnrefractedSolarZenith | None = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT, # radians
albedo: float | None = ALBEDO_DEFAULT,
apply_reflectivity_factor: bool = ANGULAR_LOSS_FACTOR_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = SolarPositionModel.noaa,
sun_horizon_position: List[SunHorizonPositionModel] = SUN_HORIZON_POSITION_DEFAULT,
solar_incidence_model: SolarIncidenceModel = SolarIncidenceModel.jenco,
# complementary_incidence_angle: bool = COMPLEMENTARY_INCIDENCE_ANGLE_DEFAULT,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = ShadingModel.pvgis,
shading_states: List[ShadingState] = [ShadingState.all],
solar_time_model: SolarTimeModel = SolarTimeModel.noaa,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
# angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> GlobalInclinedIrradiance:
"""
"""
# build reusable parameter dictionaries
coordinates = {
'longitude': longitude,
'latitude': latitude,
}
location_arguments = build_location_dictionary(
**coordinates,
elevation=elevation,
)
time = {
'timestamps': timestamps,
'timezone': timezone,
}
horizontal_irradiance = {
'global_horizontal_irradiance': global_horizontal_irradiance,
'direct_horizontal_irradiance': direct_horizontal_irradiance,
}
solar_positioning = {
'solar_position_model': solar_position_model,
'adjust_for_atmospheric_refraction': adjust_for_atmospheric_refraction,
'solar_time_model': solar_time_model,
}
surface_position = {
'surface_orientation': surface_orientation,
'surface_tilt': surface_tilt,
}
earth_orbit = {
'eccentricity_phase_offset': eccentricity_phase_offset,
'eccentricity_amplitude': eccentricity_amplitude,
}
array_parameters = {
"dtype": dtype,
"array_backend": array_backend,
}
output_parameters = {
'verbose': verbose, # Is this wanted here ? i.e. not setting = 0 ?
'log': log,
}
# Some quantities are not always required, hence set them to avoid UnboundLocalError!
extended_array_parameters = {
"shape": timestamps.shape,
"dtype": dtype,
"init_method": "empty",
"backend": array_backend,
} # Borrow shape from timestamps
solar_azimuth_series = SolarAzimuth(
value=create_array(**extended_array_parameters),
unit="Unitless",
origin='Not Required!',
)
solar_incidence_series = model_solar_incidence_series(
**coordinates,
**time,
**surface_position,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
solar_time_model=solar_time_model,
solar_position_model=solar_position_model,
sun_horizon_position=sun_horizon_position,
solar_incidence_model=solar_incidence_model,
horizon_profile=horizon_profile,
shading_model=shading_model,
# complementary_incidence_angle=complementary_incidence_angle,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
validate_output=validate_output,
verbose=verbose,
log=log,
# solar_incidence_model=solar_incidence_model,
# complementary_incidence_angle=True, # = Sun-vector To Surface-plane (Jenčo, 1992) !
# zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
# **earth_orbit,
# **array_parameters,
# **output_parameters,
)
# Calculate quantities required : ---------------------------- >>> >>> >>>
# 1. to model the diffuse horizontal irradiance [optional]
# 2. to calculate the diffuse sky ... to consider shaded, sunlit and potentially sunlit surfaces
# extraterrestrial on a horizontal surface requires the solar altitude
solar_altitude_series = model_solar_altitude_series(
**coordinates,
**time,
solar_position_model=solar_position_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
# solar_time_model=solar_time_model,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
# angle_output_units=angle_output_units,
**array_parameters,
validate_output=validate_output,
**output_parameters,
)
# Calculate quantities required : ---------------------------- <<< <<< <<<
if surface_tilt > surface_tilt_horizontally_flat_panel_threshold: # tilted (or inclined) surface
# requires the solar incidence angle for shading and times of sunlit surface
solar_incidence_series = model_solar_incidence_series(
**coordinates,
**time,
**surface_position,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
solar_incidence_model=solar_incidence_model,
horizon_profile=horizon_profile,
shading_model=shading_model,
complementary_incidence_angle=True, # True = between sun-vector and surface-plane !
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
**earth_orbit,
**array_parameters,
validate_output=validate_output,
**output_parameters,
)
# Potentially sunlit surface series : solar altitude < 0.1 radians (or < 5.7 degrees)
if numpy.any(solar_altitude_series.radians < 0.1): # requires the solar azimuth
solar_azimuth_series = model_solar_azimuth_series(
**coordinates,
**time,
**solar_positioning,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
**earth_orbit,
verbose=verbose,
)
surface_in_shade_series = model_surface_in_shade_series(
horizon_profile=horizon_profile,
**coordinates,
**time,
**solar_positioning,
shading_model=shading_model,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
**earth_orbit,
**array_parameters,
validate_output=validate_output,
**output_parameters,
)
if isinstance(global_horizontal_irradiance, ndarray) and isinstance(
direct_horizontal_irradiance, ndarray
):
global_inclined_irradiance_series = calculate_global_inclined_irradiance_hofierka(
longitude=longitude,
latitude=latitude,
elevation=elevation,
timestamps=timestamps,
timezone=timezone,
surface_orientation=surface_orientation,
surface_tilt=surface_tilt,
**horizontal_irradiance,
linke_turbidity_factor_series=linke_turbidity_factor_series,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
albedo=albedo,
apply_reflectivity_factor=apply_reflectivity_factor,
solar_incidence_series=solar_incidence_series,
solar_altitude_series=solar_altitude_series,
solar_azimuth_series=solar_azimuth_series,
**solar_positioning,
sun_horizon_position=sun_horizon_position,
surface_in_shade_series=surface_in_shade_series,
shading_states=shading_states,
# solar_incidence_model=solar_incidence_model,
# zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
# horizon_profile=horizon_profile,
# shading_model=shading_model,
solar_constant=solar_constant,
**earth_orbit,
**array_parameters,
validate_output=validate_output,
**output_parameters,
# angle_output_units=angle_output_units,
fingerprint=fingerprint,
)
else:
global_inclined_irradiance_series = calculate_clear_sky_global_inclined_irradiance_hofierka(
**location_arguments,
**surface_position,
surface_tilt_horizontally_flat_panel_threshold=surface_tilt_horizontally_flat_panel_threshold,
**time,
**horizontal_irradiance,
linke_turbidity_factor_series=linke_turbidity_factor_series,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
albedo=albedo,
apply_reflectivity_factor=apply_reflectivity_factor,
solar_incidence_series=solar_incidence_series,
solar_altitude_series=solar_altitude_series,
solar_azimuth_series=solar_azimuth_series,
**solar_positioning,
sun_horizon_position=sun_horizon_position,
surface_in_shade_series=surface_in_shade_series,
shading_states=shading_states,
solar_constant=solar_constant,
**earth_orbit,
**array_parameters,
validate_output=validate_output,
**output_parameters,
# angle_output_units=angle_output_units,
fingerprint=fingerprint,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
global_inclined_irradiance_series.build_output(verbose, fingerprint)
log_data_fingerprint(
data=global_inclined_irradiance_series.value,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return global_inclined_irradiance_series
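The branch above dispatches between the measured-irradiance and the clear-sky Hofierka model depending on whether both horizontal irradiance inputs are NumPy arrays. The selection pattern reduces to the following sketch (function and return labels are illustrative, not part of the API):

```python
import numpy as np

def choose_model(global_horizontal, direct_horizontal):
    """Pick the Hofierka variant: use measured inputs when both
    horizontal components are arrays, else fall back to clear-sky."""
    if isinstance(global_horizontal, np.ndarray) and isinstance(
        direct_horizontal, np.ndarray
    ):
        return "measured"
    return "clear-sky"
```

Note that plain Python lists do not count: only `numpy.ndarray` inputs select the measured-irradiance path.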
performance ¶
Modules:
| Name | Description |
|---|---|
analysis | |
helpers | |
report | |
summarise | |
analysis ¶
Functions:
| Name | Description |
|---|---|
analyse_photovoltaic_performance | Analyze the photovoltaic performance from time-series data. |
analyse_photovoltaic_performance ¶
analyse_photovoltaic_performance(
dictionary,
timestamps: DatetimeIndex | Timestamp,
frequency: str,
rounding_places=1,
dtype=DATA_TYPE_DEFAULT,
array_backend=ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
)
Analyze the photovoltaic performance from time-series data.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| dictionary | | Data containing time-series for various PV metrics. | required |
| timestamps | DatetimeIndex \| Timestamp | Timestamps corresponding to the data points. | required |
| frequency | str | Resampling frequency for mean calculations. | required |
| rounding_places | | Decimal places for rounding results. | 1 |
| dtype | | Data type for numerical calculations. | DATA_TYPE_DEFAULT |
| array_backend | | | ARRAY_BACKEND_DEFAULT |
| verbose | int | | VERBOSE_LEVEL_DEFAULT |
Returns:
| Type | Description |
|---|---|
| | A dictionary with performance analysis results. |
Notes
Workflow
In-Plane Irradiance
    │ Reflectivity Effect
    ▼
In-Plane Irradiance After Reflectivity Loss
    [ also referred to as inclined irradiance ]
    │ Spectral Effect
    ▼
Effective Irradiance
    │ Temp. & Low Irradiance Coefficients
    ▼
Effective Power
    │ System Loss
    ▼
Photovoltaic Power Output
Total Change
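The workflow is additive: each stage's "effect" series (typically negative) is added to the previous stage's total, and each effect is reported as a percentage of the quantity it applies to. A minimal sketch with made-up numbers:

```python
import numpy as np

# Illustrative values only: in-plane irradiance in W/m^2 plus two
# (negative) loss terms from the workflow above.
in_plane = np.array([800.0, 950.0, 600.0])
reflectivity_effect = np.array([-12.0, -9.0, -15.0])
spectral_effect = np.array([-8.0, -10.0, -6.0])

# Each stage adds its effect to the running total.
after_reflectivity = in_plane + reflectivity_effect
effective = after_reflectivity + spectral_effect

# Effects are expressed as a percentage of the preceding quantity.
reflectivity_pct = 100 * reflectivity_effect.sum() / in_plane.sum()
```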
Notes
System efficiency
Currently, the (default) system efficiency `dictionary.system_efficiency` is a single, constant floating point number. Nevertheless, it is converted to a series for two reasons:
1. to make it easier for the function `calculate_mean_of_series_per_time_unit()` to derive the quantity 'system_efficiency_effect_mean': essentially, `polars.DataFrame()` expects all input data series to be of the same length.
2. to support scenarios with a fine-grained system efficiency time series.
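The scalar-to-series conversion described above can be sketched as follows. This is a simplified stand-in for the internal `create_array` helper, using plain NumPy; the variable names are illustrative:

```python
import numpy as np
import pandas as pd

timestamps = pd.date_range("2024-06-01", periods=4, freq="h")
system_efficiency = 0.86  # a single constant value, as in the default case

# Broadcast the scalar to a series matching the timestamps' shape, so all
# columns later handed to polars.DataFrame() share one length.
system_efficiency_series = np.full(timestamps.shape, system_efficiency)
```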
Source code in pvgisprototype/api/performance/analysis.py
def analyse_photovoltaic_performance(
dictionary,
timestamps: DatetimeIndex | Timestamp,
frequency: str,
rounding_places=1,
dtype=DATA_TYPE_DEFAULT,
array_backend=ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
):
"""Analyze the photovoltaic performance from time-series data.
Parameters
----------
dictionary :
Data containing time-series for various PV metrics.
timestamps :
Timestamps corresponding to the data points.
frequency :
Resampling frequency for mean calculations.
rounding_places :
Decimal places for rounding results.
dtype :
Data type for numerical calculations.
Returns
-------
A dictionary with performance analysis results.
Notes
-----
Workflow
In-Plane Irradiance
┌───────────┘
│ Reflectivity Effect
└┐─────────────────
▼
In-Plane Irradiance After Reflectivity Loss
[ also referred to as inclined irradiance ]
┌───────────┘
│ Spectral Effect
└┐───────────────
▼
Effective Irradiance
┌───────────┘
│ Temp. & Low Irradiance Coefficients
└┐───────────────────────────────────
▼
Effective Power
┌───────────┘
│ System Loss
└┐───────────
▼
Photovoltaic Power Output
------------
Total Change
------------
Notes
-----
System efficiency
Currently, the (default) system efficiency `dictionary.system_efficiency`
is a single and constant floating point number. Nevertheless, we convert
it to a series for the following reasons :
1. to make it easier for the function
`calculate_mean_of_series_per_time_unit()` to derive the quantity
'system_efficiency_effect_mean' : essentially, the function
`polars.DataFrame()` expects all input "data series" to be of the same
length.
2. to support scenarios of a fine-grained system efficiency time series
"""
# In-Plane irradiance (before effects)
# ------------------------------------------------------------------------
# To Do : In-Plane "Irradiation" ?
# Add Standard Deviation in kWh ? Monthly, Yearly ?
# ------------------------------------------------------------------------
inclined_irradiance_series = dictionary.global_inclined_before_reflectivity
inclined_irradiance, inclined_irradiance_mean, inclined_irradiance_std, _ = (
calculate_statistics(
inclined_irradiance_series,
timestamps,
frequency,
1,
rounding_places,
)
)
# Reflectivity
reflected_series = dictionary.global_inclined_reflected
(
reflectivity_effect,
reflectivity_effect_mean,
reflectivity_effect_std,
reflectivity_effect_percentage,
) = calculate_statistics(
reflected_series,
timestamps,
frequency,
inclined_irradiance,
rounding_places,
)
# After reflectivity
irradiance_after_reflectivity = inclined_irradiance + reflectivity_effect
irradiance_after_reflectivity_mean = calculate_mean_of_series_per_time_unit(
inclined_irradiance_series + reflected_series,
timestamps=timestamps,
frequency=frequency,
)
# Spectral effect
spectral_effect_series = dictionary.spectral_effect
(
spectral_effect,
spectral_effect_mean,
spectral_effect_std,
spectral_effect_percentage,
) = calculate_statistics(
spectral_effect_series,
timestamps,
frequency,
irradiance_after_reflectivity,
rounding_places,
)
effective_irradiance = irradiance_after_reflectivity + spectral_effect
# effective_irradiance_percentage = (
# (effective_irradiance / inclined_irradiance * 100)
# if inclined_irradiance != 0
# else 0
# )
effective_irradiance_mean = (
irradiance_after_reflectivity_mean + spectral_effect_mean
)
effective_irradiance_effect = effective_irradiance - inclined_irradiance
# with numpy.errstate(divide="ignore", invalid="ignore"): # if irradiance == 0
# effective_irradiance_effect_percentage = where(
# inclined_irradiance != 0,
# 100 * effective_irradiance_effect / inclined_irradiance,
# 0,
# ).item() # get a Python float
# "Effective" Power without System Loss
photovoltaic_power_without_system_loss_series = dictionary.photovoltaic_power_without_system_loss
(
photovoltaic_power_without_system_loss,
photovoltaic_power_without_system_loss_mean,
photovoltaic_power_without_system_loss_std,
_,
) = calculate_statistics(
photovoltaic_power_without_system_loss_series,
timestamps,
frequency,
reference_series=1,
rounding_places=rounding_places,
dtype=dtype,
array_backend=array_backend,
)
# Temperature & Low Irradiance
photovoltaic_power_rating_model = dictionary.power_model
temperature_and_low_irradiance_effect = (
photovoltaic_power_without_system_loss - effective_irradiance
)
temperature_and_low_irradiance_effect_mean = (
photovoltaic_power_without_system_loss_mean - effective_irradiance_mean
)
with numpy.errstate(divide="ignore", invalid="ignore"): # if irradiance == 0
temperature_and_low_irradiance_effect_percentage = where(
effective_irradiance != 0,
100
* temperature_and_low_irradiance_effect
/ numpy.array(
effective_irradiance
), # still need to handle the case when effective_irradiance == 0 <-- single float
0,
).item() # get a Python float
# System efficiency _series_ -- see Notes in the docstring
array_parameters = {
"shape": timestamps.shape,
"dtype": dtype,
"init_method": dictionary.system_efficiency, # or 'empty' ?
"backend": array_backend,
} # Borrow shape from timestamps
system_efficiency_series = create_array(**array_parameters)
system_efficiency = numpy.nanmedian(system_efficiency_series).astype(dtype)
system_efficiency_effect = numpy.array(
photovoltaic_power_without_system_loss * system_efficiency
- photovoltaic_power_without_system_loss,
dtype=dtype
).item() # Important !
system_efficiency_effect_mean = calculate_mean_of_series_per_time_unit(
photovoltaic_power_without_system_loss_mean * system_efficiency_series
- photovoltaic_power_without_system_loss_mean,
timestamps=timestamps,
frequency=frequency,
)
with numpy.errstate(divide="ignore", invalid="ignore"): # if irradiance == 0
system_efficiency_effect_percentage = where(
photovoltaic_power_without_system_loss != 0,
100
* system_efficiency_effect
/ numpy.array(
photovoltaic_power_without_system_loss
), # still need to handle the case when photovoltaic_power_without_system_loss == 0 <-- single float
0,
).item() # get a Python float
# Photovoltaic Power
photovoltaic_power_series = dictionary.value
photovoltaic_power, photovoltaic_power_mean, photovoltaic_power_std, _ = (
calculate_statistics(
photovoltaic_power_series,
timestamps,
frequency,
1,
rounding_places,
)
)
peak_power = dictionary.peak_power
photovoltaic_energy = photovoltaic_power * peak_power
photovoltaic_energy_mean = photovoltaic_power_mean * peak_power
# Total effect
total_effect = photovoltaic_power - inclined_irradiance
total_effect_mean = photovoltaic_power_mean - inclined_irradiance_mean
with numpy.errstate(divide="ignore", invalid="ignore"): # if irradiance == 0
total_effect_percentage = where(
inclined_irradiance != 0,
total_effect
/ numpy.array(inclined_irradiance)
* 100, # still need to handle the case when inclined_irradiance == 0 <-- single float
0,
).item()
# Handle units
inclined_irradiance, inclined_irradiance_unit = kilofy_unit(
inclined_irradiance, IRRADIANCE_UNIT
)
inclined_irradiance_mean, inclined_irradiance_mean_unit = kilofy_unit(
inclined_irradiance_mean, IRRADIANCE_UNIT
)
reflectivity_effect, reflectivity_effect_unit = kilofy_unit(
reflectivity_effect, IRRADIANCE_UNIT
)
reflectivity_effect_mean, reflectivity_effect_mean_unit = kilofy_unit(
reflectivity_effect_mean, IRRADIANCE_UNIT
)
irradiance_after_reflectivity, irradiance_after_reflectivity_unit = kilofy_unit(
irradiance_after_reflectivity, IRRADIANCE_UNIT
)
irradiance_after_reflectivity_mean, irradiance_after_reflectivity_mean_unit = (
kilofy_unit(irradiance_after_reflectivity_mean, IRRADIANCE_UNIT)
)
spectral_effect, spectral_effect_unit = kilofy_unit(
spectral_effect, IRRADIANCE_UNIT
)
spectral_effect_mean, spectral_effect_mean_unit = kilofy_unit(
spectral_effect_mean, IRRADIANCE_UNIT
)
effective_irradiance, effective_irradiance_unit = kilofy_unit(
effective_irradiance, IRRADIANCE_UNIT
)
effective_irradiance_mean, effective_irradiance_mean_unit = kilofy_unit(
effective_irradiance_mean, IRRADIANCE_UNIT
)
(
photovoltaic_power_without_system_loss,
photovoltaic_power_without_system_loss_unit,
) = kilofy_unit(photovoltaic_power_without_system_loss, PHOTOVOLTAIC_POWER_UNIT)
(
photovoltaic_power_without_system_loss_mean,
photovoltaic_power_without_system_loss_mean_unit,
) = kilofy_unit(
photovoltaic_power_without_system_loss_mean, PHOTOVOLTAIC_POWER_UNIT
)
(
temperature_and_low_irradiance_effect,
temperature_and_low_irradiance_effect_unit,
) = kilofy_unit(temperature_and_low_irradiance_effect, IRRADIANCE_UNIT)
(
temperature_and_low_irradiance_effect_mean,
temperature_and_low_irradiance_effect_mean_unit,
) = kilofy_unit(temperature_and_low_irradiance_effect_mean, IRRADIANCE_UNIT)
system_efficiency_effect, system_efficiency_effect_unit = kilofy_unit(
system_efficiency_effect, PHOTOVOLTAIC_POWER_UNIT
)
system_efficiency_effect_mean, system_efficiency_effect_mean_unit = kilofy_unit(
system_efficiency_effect_mean, PHOTOVOLTAIC_POWER_UNIT
)
photovoltaic_power, photovoltaic_power_unit = kilofy_unit(
photovoltaic_power, PHOTOVOLTAIC_POWER_UNIT
)
photovoltaic_power_mean, photovoltaic_power_mean_unit = kilofy_unit(
photovoltaic_power_mean, PHOTOVOLTAIC_POWER_UNIT
)
photovoltaic_energy, photovoltaic_energy_unit = kilofy_unit(
photovoltaic_energy, PHOTOVOLTAIC_ENERGY_UNIT
)
photovoltaic_energy_mean, photovoltaic_energy_mean_unit = kilofy_unit(
photovoltaic_energy_mean, PHOTOVOLTAIC_ENERGY_UNIT
)
total_effect, total_effect_unit = kilofy_unit(total_effect, IRRADIANCE_UNIT)
total_effect_mean, total_effect_mean_unit = kilofy_unit(
total_effect_mean, IRRADIANCE_UNIT
)
performance_analysis = {
TOTAL_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME: inclined_irradiance,
UNIT_FOR_TOTAL_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME: inclined_irradiance_unit,
MEAN_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME: inclined_irradiance_mean,
UNIT_FOR_MEAN_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME: inclined_irradiance_mean_unit,
STANDARD_DEVIATION_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME: inclined_irradiance_std,
GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME: inclined_irradiance_series,
#
TOTAL_REFLECTIVITY_EFFECT_COLUMN_NAME: reflectivity_effect,
UNIT_FOR_TOTAL_REFLECTIVITY_EFFECT_COLUMN_NAME: reflectivity_effect_unit,
MEAN_REFLECTIVITY_EFFECT_COLUMN_NAME: reflectivity_effect_mean,
UNIT_FOR_MEAN_REFLECTIVITY_EFFECT_COLUMN_NAME: reflectivity_effect_mean_unit,
STANDARD_DEVIATION_REFLECTIVITY_EFFECT_COLUMN_NAME: reflectivity_effect_std,
REFLECTIVITY_EFFECT_PERCENTAGE_COLUMN_NAME: reflectivity_effect_percentage,
REFLECTIVITY_COLUMN_NAME: reflected_series,
#
GLOBAL_IN_PLANE_IRRADIANCE_AFTER_REFLECTIVITY_COLUMN_NAME: irradiance_after_reflectivity,
UNIT_FOR_GLOBAL_IN_PLANE_IRRADIANCE_AFTER_REFLECTIVITY_COLUMN_NAME: irradiance_after_reflectivity_unit,
MEAN_GLOBAL_IN_PLANE_IRRADIANCE_AFTER_REFLECTIVITY_COLUMN_NAME: irradiance_after_reflectivity_mean,
UNIT_FOR_MEAN_GLOBAL_IN_PLANE_IRRADIANCE_AFTER_REFLECTIVITY_COLUMN_NAME: irradiance_after_reflectivity_mean_unit,
#
TOTAL_SPECTRAL_EFFECT_COLUMN_NAME: spectral_effect,
UNIT_FOR_TOTAL_SPECTRAL_EFFECT_COLUMN_NAME: spectral_effect_unit,
MEAN_SPECTRAL_EFFECT_COLUMN_NAME: spectral_effect_mean,
UNIT_FOR_MEAN_SPECTRAL_EFFECT_COLUMN_NAME: spectral_effect_mean_unit,
STANDARD_DEVIATION_SPECTRAL_EFFECT_COLUMN_NAME: spectral_effect_std,
SPECTRAL_EFFECT_PERCENTAGE_COLUMN_NAME: spectral_effect_percentage,
SPECTRAL_EFFECT_COLUMN_NAME: spectral_effect_series,
#
EFFECTIVE_IRRADIANCE_COLUMN_NAME: effective_irradiance,
UNIT_FOR_EFFECTIVE_IRRADIANCE_COLUMN_NAME: effective_irradiance_unit,
MEAN_EFFECTIVE_IRRADIANCE_COLUMN_NAME: effective_irradiance_mean,
UNIT_FOR_MEAN_EFFECTIVE_IRRADIANCE_COLUMN_NAME: effective_irradiance_mean_unit,
#
TOTAL_TEMPERATURE_AND_LOW_IRRADIANCE_EFFECT_COLUMN_NAME: temperature_and_low_irradiance_effect,
UNIT_FOR_TEMPERATURE_AND_LOW_IRRADIANCE_EFFECT_COLUMN_NAME: temperature_and_low_irradiance_effect_unit,
MEAN_TEMPERATURE_AND_LOW_IRRADIANCE_EFFECT_COLUMN_NAME: temperature_and_low_irradiance_effect_mean,
UNIT_FOR_MEAN_TEMPERATURE_AND_LOW_IRRADIANCE_EFFECT_COLUMN_NAME: temperature_and_low_irradiance_effect_mean_unit,
TEMPERATURE_AND_LOW_IRRADIANCE_EFFECT_PERCENTAGE_COLUMN_NAME: temperature_and_low_irradiance_effect_percentage,
#
TOTAL_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME: photovoltaic_power_without_system_loss,
UNIT_FOR_TOTAL_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME: photovoltaic_power_without_system_loss_unit,
MEAN_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME: photovoltaic_power_without_system_loss_mean,
UNIT_FOR_MEAN_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME: photovoltaic_power_without_system_loss_mean_unit,
STANDARD_DEVIATION_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME: photovoltaic_power_without_system_loss_std,
PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME: photovoltaic_power_without_system_loss_series,
#
TOTAL_SYSTEM_EFFICIENCY_EFFECT_COLUMN_NAME: system_efficiency_effect,
UNIT_FOR_TOTAL_SYSTEM_EFFICIENCY_EFFECT_COLUMN_NAME: system_efficiency_effect_unit,
MEAN_SYSTEM_EFFICIENCY_EFFECT_COLUMN_NAME: system_efficiency_effect_mean,
UNIT_FOR_MEAN_SYSTEM_EFFICIENCY_EFFECT_COLUMN_NAME: system_efficiency_effect_mean_unit,
SYSTEM_EFFICIENCY_EFFECT_PERCENTAGE_COLUMN_NAME: system_efficiency_effect_percentage,
#
TOTAL_PHOTOVOLTAIC_POWER_COLUMN_NAME: photovoltaic_power,
UNIT_FOR_TOTAL_PHOTOVOLTAIC_POWER_COLUMN_NAME: photovoltaic_power_unit,
MEAN_PHOTOVOLTAIC_POWER_COLUMN_NAME: photovoltaic_power_mean,
UNIT_FOR_MEAN_PHOTOVOLTAIC_POWER_COLUMN_NAME: photovoltaic_power_mean_unit,
STANDARD_DEVIATION_PHOTOVOLTAIC_POWER_COLUMN_NAME: photovoltaic_power_std,
PHOTOVOLTAIC_POWER_COLUMN_NAME: photovoltaic_power_series,
#
PHOTOVOLTAIC_ENERGY_COLUMN_NAME: photovoltaic_energy,
UNIT_FOR_PHOTOVOLTAIC_ENERGY_COLUMN_NAME: photovoltaic_energy_unit,
MEAN_PHOTOVOLTAIC_ENERGY_COLUMN_NAME: photovoltaic_energy_mean,
UNIT_FOR_MEAN_PHOTOVOLTAIC_ENERGY_COLUMN_NAME: photovoltaic_energy_mean_unit,
#
TOTAL_EFFECT_COLUMN_NAME: total_effect,
UNIT_FOR_TOTAL_EFFECT_COLUMN_NAME: total_effect_unit,
MEAN_EFFECT_COLUMN_NAME: total_effect_mean,
UNIT_FOR_MEAN_EFFECT_COLUMN_NAME: total_effect_mean_unit,
EFFECT_PERCENTAGE_COLUMN_NAME: total_effect_percentage,
}
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return performance_analysis
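The zero-guarded percentage pattern used repeatedly above (via `numpy.errstate` and `where`) can be isolated into a small helper. The name `safe_percentage` is hypothetical, not part of the API:

```python
import numpy as np

def safe_percentage(part, whole):
    """Return 100 * part / whole, or 0.0 where the denominator is zero."""
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(
            np.asarray(whole) != 0,
            100 * np.asarray(part) / np.asarray(whole),
            0.0,
        )
```

For scalar inputs the result is a 0-d array, so `.item()` yields a Python float, mirroring the `.item()` calls in the function above.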
helpers ¶
Functions:
| Name | Description |
|---|---|
kilofy_unit | Converts the unit of a given value to its kilo-equivalent if the absolute value meets the threshold. |
kilofy_unit ¶
Converts the unit of a given value to its kilo-equivalent if the absolute value is greater than or equal to a threshold (1000 by default).
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| value | float | The numerical value to potentially convert. | required |
| unit | str | The current unit of the value, defaulting to 'W' (Watts). | 'W' |
| threshold | | The absolute value at or above which the value is scaled to its kilo-equivalent. | 1000 |
Returns:
| Name | Type | Description |
|---|---|---|
| tuple | | The converted value and its unit. If the absolute value is at or above the threshold, the value is divided by 1000 and the unit changes to its kilo-equivalent, e.g. 'kW' (kilowatts). |
Examples:
>>> kilofy_unit(1500, "W", 1000)
(1.5, "kW")
>>> kilofy_unit(500, "W", 1000)
(500, "W")
Source code in pvgisprototype/api/performance/helpers.py
def kilofy_unit(value, unit="W", threshold=1000):
"""Converts the unit of a given value to its kilo-equivalent if the
absolute value is greater than or equal to 1000.
Parameters
----------
value : float
The numerical value to potentially convert.
unit : str
The current unit of the value, defaulting to 'W' (Watts).
threshold : float
The absolute value at or above which the value is divided by 1000, defaulting to 1000.
Returns
-------
tuple :
The converted value and its unit. If the absolute value is at or above
the threshold, the value is divided by 1000 and the unit changes to its
kilo-equivalent, e.g. 'kW' (kilowatts).
Examples
--------
>>> kilofy_unit(1500, "W", 1000)
(1.5, "kW")
>>> kilofy_unit(500, "W", 1000)
(500, "W")
"""
if value is not None:
if abs(value) >= threshold and unit == IRRADIANCE_UNIT:
return value / 1000, IRRADIANCE_UNIT_K # update to kilo
if abs(value) >= threshold and unit == POWER_UNIT:
return value / 1000, POWER_UNIT_K # update to kilo
if abs(value) >= threshold and unit == ENERGY_UNIT:
return value / 1000, ENERGY_UNIT_K # update to kilo
return value, unit
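A self-contained version of the same thresholding logic, using plain string units instead of the module's unit constants (`IRRADIANCE_UNIT`, `POWER_UNIT`, etc. are assumed here to be Watt-style strings):

```python
def kilofy(value, unit="W", threshold=1000):
    """Scale value to its kilo-equivalent when abs(value) >= threshold.

    A simplified sketch: unlike the original, it prefixes 'k' to any
    unit string instead of matching against known unit constants.
    """
    if value is not None and abs(value) >= threshold:
        return value / 1000, "k" + unit
    return value, unit
```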
report ¶
Functions:
| Name | Description |
|---|---|
report_photovoltaic_performance | |
report_photovoltaic_performance ¶
report_photovoltaic_performance(
dictionary,
timestamps: DatetimeIndex | Timestamp,
frequency: Frequency = Hourly,
rounding_places: int = ROUNDING_PLACES_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
)
Source code in pvgisprototype/api/performance/report.py
def report_photovoltaic_performance(
dictionary,
timestamps: DatetimeIndex | Timestamp,
frequency: Frequency = Frequency.Hourly,
rounding_places: int = ROUNDING_PLACES_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
):
""" """
photovoltaic_performance_analysis = analyse_photovoltaic_performance(
dictionary=dictionary,
timestamps=timestamps,
frequency=frequency,
rounding_places=rounding_places,
dtype=dtype,
array_backend=ARRAY_BACKEND_DEFAULT,
verbose=verbose,
)
inclined_irradiance = photovoltaic_performance_analysis.get(
TOTAL_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME, None
)
inclined_irradiance_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_TOTAL_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME, None
)
inclined_irradiance_mean = photovoltaic_performance_analysis.get(
MEAN_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME, None
)
inclined_irradiance_mean_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_MEAN_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME, None
)
inclined_irradiance_std = photovoltaic_performance_analysis.get(
STANDARD_DEVIATION_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME, None
)
inclined_irradiance_series = photovoltaic_performance_analysis.get(
GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME, None
)
reflectivity_change = photovoltaic_performance_analysis.get(
TOTAL_REFLECTIVITY_EFFECT_COLUMN_NAME, None
)
reflectivity_change_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_TOTAL_REFLECTIVITY_EFFECT_COLUMN_NAME, None
)
reflectivity_change_mean = photovoltaic_performance_analysis.get(
MEAN_REFLECTIVITY_EFFECT_COLUMN_NAME, None
)
reflectivity_change_mean_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_MEAN_REFLECTIVITY_EFFECT_COLUMN_NAME, None
)
reflectivity_change_std = photovoltaic_performance_analysis.get(
STANDARD_DEVIATION_REFLECTIVITY_EFFECT_COLUMN_NAME, None
)
reflectivity_change_percentage = photovoltaic_performance_analysis.get(
REFLECTIVITY_EFFECT_PERCENTAGE_COLUMN_NAME, None
)
reflectivity_series = photovoltaic_performance_analysis.get(
REFLECTIVITY_COLUMN_NAME, None
)
irradiance_after_reflectivity = photovoltaic_performance_analysis.get(
GLOBAL_IN_PLANE_IRRADIANCE_AFTER_REFLECTIVITY_COLUMN_NAME, None
)
irradiance_after_reflectivity_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_GLOBAL_IN_PLANE_IRRADIANCE_AFTER_REFLECTIVITY_COLUMN_NAME, None
)
irradiance_after_reflectivity_mean = photovoltaic_performance_analysis.get(
MEAN_GLOBAL_IN_PLANE_IRRADIANCE_AFTER_REFLECTIVITY_COLUMN_NAME, None
)
irradiance_after_reflectivity_mean_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_MEAN_GLOBAL_IN_PLANE_IRRADIANCE_AFTER_REFLECTIVITY_COLUMN_NAME, None
)
spectral_effect = photovoltaic_performance_analysis.get(
TOTAL_SPECTRAL_EFFECT_COLUMN_NAME, None
)
spectral_effect_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_TOTAL_SPECTRAL_EFFECT_COLUMN_NAME, None
)
spectral_effect_mean = photovoltaic_performance_analysis.get(
MEAN_SPECTRAL_EFFECT_COLUMN_NAME, None
)
spectral_effect_mean_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_MEAN_SPECTRAL_EFFECT_COLUMN_NAME, None
)
spectral_effect_std = photovoltaic_performance_analysis.get(
STANDARD_DEVIATION_SPECTRAL_EFFECT_COLUMN_NAME, None
)
spectral_effect_percentage = photovoltaic_performance_analysis.get(
SPECTRAL_EFFECT_PERCENTAGE_COLUMN_NAME, None
)
spectral_effect_series = photovoltaic_performance_analysis.get(
SPECTRAL_EFFECT_COLUMN_NAME, None
)
effective_irradiance = photovoltaic_performance_analysis.get(
EFFECTIVE_IRRADIANCE_COLUMN_NAME, None
)
effective_irradiance_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_EFFECTIVE_IRRADIANCE_COLUMN_NAME, None
)
effective_irradiance_mean = photovoltaic_performance_analysis.get(
MEAN_EFFECTIVE_IRRADIANCE_COLUMN_NAME, None
)
effective_irradiance_mean_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_MEAN_EFFECTIVE_IRRADIANCE_COLUMN_NAME, None
)
temperature_and_low_irradiance_change = photovoltaic_performance_analysis.get(
TOTAL_TEMPERATURE_AND_LOW_IRRADIANCE_EFFECT_COLUMN_NAME, None
)
temperature_and_low_irradiance_change_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_TEMPERATURE_AND_LOW_IRRADIANCE_EFFECT_COLUMN_NAME, None
)
temperature_and_low_irradiance_change_mean = photovoltaic_performance_analysis.get(
MEAN_TEMPERATURE_AND_LOW_IRRADIANCE_EFFECT_COLUMN_NAME, None
)
temperature_and_low_irradiance_change_mean_unit = (
photovoltaic_performance_analysis.get(
UNIT_FOR_MEAN_TEMPERATURE_AND_LOW_IRRADIANCE_EFFECT_COLUMN_NAME, None
)
)
temperature_and_low_irradiance_change_percentage = (
photovoltaic_performance_analysis.get(
TEMPERATURE_AND_LOW_IRRADIANCE_EFFECT_PERCENTAGE_COLUMN_NAME, None
)
)
photovoltaic_power_without_system_loss = photovoltaic_performance_analysis.get(
TOTAL_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME, None
)
photovoltaic_power_without_system_loss_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_TOTAL_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME, None
)
photovoltaic_power_without_system_loss_mean = photovoltaic_performance_analysis.get(
MEAN_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME, None
)
photovoltaic_power_without_system_loss_mean_unit = (
photovoltaic_performance_analysis.get(
UNIT_FOR_MEAN_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME, None
)
)
photovoltaic_power_without_system_loss_std = photovoltaic_performance_analysis.get(
STANDARD_DEVIATION_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME, None
)
photovoltaic_power_without_system_loss_series = (
photovoltaic_performance_analysis.get(
PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME, None
)
)
system_efficiency_change = photovoltaic_performance_analysis.get(
TOTAL_SYSTEM_EFFICIENCY_EFFECT_COLUMN_NAME, None
)
system_efficiency_change_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_TOTAL_SYSTEM_EFFICIENCY_EFFECT_COLUMN_NAME, None
)
system_efficiency_change_mean = photovoltaic_performance_analysis.get(
MEAN_SYSTEM_EFFICIENCY_EFFECT_COLUMN_NAME, None
)
system_efficiency_change_mean_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_MEAN_SYSTEM_EFFICIENCY_EFFECT_COLUMN_NAME, None
)
system_efficiency_change_percentage = photovoltaic_performance_analysis.get(
SYSTEM_EFFICIENCY_EFFECT_PERCENTAGE_COLUMN_NAME, None
)
photovoltaic_power = photovoltaic_performance_analysis.get(
TOTAL_PHOTOVOLTAIC_POWER_COLUMN_NAME, None
)
photovoltaic_power_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_TOTAL_PHOTOVOLTAIC_POWER_COLUMN_NAME, None
)
photovoltaic_power_mean = photovoltaic_performance_analysis.get(
MEAN_PHOTOVOLTAIC_POWER_COLUMN_NAME, None
)
photovoltaic_power_mean_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_MEAN_PHOTOVOLTAIC_POWER_COLUMN_NAME, None
)
photovoltaic_power_std = photovoltaic_performance_analysis.get(
STANDARD_DEVIATION_PHOTOVOLTAIC_POWER_COLUMN_NAME, None
)
photovoltaic_power_series = photovoltaic_performance_analysis.get(
PHOTOVOLTAIC_POWER_COLUMN_NAME, None
)
photovoltaic_energy = photovoltaic_performance_analysis.get(
PHOTOVOLTAIC_ENERGY_COLUMN_NAME, None
)
photovoltaic_energy_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_PHOTOVOLTAIC_ENERGY_COLUMN_NAME, None
)
photovoltaic_energy_mean = photovoltaic_performance_analysis.get(
MEAN_PHOTOVOLTAIC_ENERGY_COLUMN_NAME, None
)
photovoltaic_energy_mean_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_MEAN_PHOTOVOLTAIC_ENERGY_COLUMN_NAME, None
)
total_change = photovoltaic_performance_analysis.get(TOTAL_EFFECT_COLUMN_NAME, None)
total_change_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_TOTAL_EFFECT_COLUMN_NAME, None
)
total_change_mean = photovoltaic_performance_analysis.get(
MEAN_EFFECT_COLUMN_NAME, None
)
total_change_mean_unit = photovoltaic_performance_analysis.get(
UNIT_FOR_MEAN_EFFECT_COLUMN_NAME, None
)
total_change_percentage = photovoltaic_performance_analysis.get(
EFFECT_PERCENTAGE_COLUMN_NAME, None
)
return {
f"[bold purple]{IN_PLANE_IRRADIANCE}": ( # Label
(inclined_irradiance, "bold purple"), # Value, Style
(inclined_irradiance_unit, "purple"),
(inclined_irradiance_mean, "bold purple"), # Mean Value, Style
(inclined_irradiance_mean_unit, "purple"),
inclined_irradiance_std,
None, # %
"bold", # Style for
None, # f"100 {GLOBAL_IRRADIANCE_NAME}", # % of (which) Quantity
inclined_irradiance_series, # input series
None, # source
),
f"{REFLECTIVITY}": (
(reflectivity_change, "magenta"),
(reflectivity_change_unit, "cyan dim"),
(reflectivity_change_mean, "magenta"),
(reflectivity_change_mean_unit, "cyan dim"),
reflectivity_change_std,
reflectivity_change_percentage,
"bold",
IN_PLANE_IRRADIANCE,
reflectivity_series,
None,
),
f"[white dim]{IRRADIANCE_AFTER_REFLECTIVITY}": (
(irradiance_after_reflectivity, "white dim"),
(irradiance_after_reflectivity_unit, "white dim"),
(irradiance_after_reflectivity_mean, "white dim"),
(irradiance_after_reflectivity_mean_unit, "white dim"),
None,
None,
"bold",
IN_PLANE_IRRADIANCE,
numpy.array([], dtype=dtype),
None,
),
f"{SPECTRAL_EFFECT_NAME}": (
(spectral_effect, "magenta"),
(spectral_effect_unit, "cyan dim"),
(spectral_effect_mean, "magenta"),
(spectral_effect_mean_unit, "cyan dim"),
spectral_effect_std,
spectral_effect_percentage,
"bold",
IN_PLANE_IRRADIANCE,
spectral_effect_series,
None,
),
f"[white dim]{EFFECTIVE_IRRADIANCE_NAME}": (
(effective_irradiance, "white dim"),
(effective_irradiance_unit, "white dim"),
(effective_irradiance_mean, "white dim"),
(effective_irradiance_mean_unit, "white dim"),
None,
None,
"bold",
None,
numpy.array([]),
None,
),
f"{TEMPERATURE_AND_LOW_IRRADIANCE_COLUMN_NAME}": (
(temperature_and_low_irradiance_change, "magenta"),
(temperature_and_low_irradiance_change_unit, "cyan dim"),
(temperature_and_low_irradiance_change_mean, "magenta"),
(temperature_and_low_irradiance_change_mean_unit, "cyan dim"),
None,
temperature_and_low_irradiance_change_percentage,
"bold",
EFFECTIVE_IRRADIANCE_NAME,
numpy.array([]),
None,
),
f"[white dim]{PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME}": (
(photovoltaic_power_without_system_loss, "white dim"),
(photovoltaic_power_without_system_loss_unit, "white dim"),
(photovoltaic_power_without_system_loss_mean, "white dim"),
(photovoltaic_power_without_system_loss_mean_unit, "white dim"),
photovoltaic_power_without_system_loss_std,
None,
"bold",
EFFECTIVE_IRRADIANCE_NAME,
photovoltaic_power_without_system_loss_series,
None,
),
f"{SYSTEM_LOSS}": (
(system_efficiency_change, "magenta"),
(system_efficiency_change_unit, "cyan dim"),
(system_efficiency_change_mean, "magenta"),
(system_efficiency_change_mean_unit, "cyan dim"),
None,
system_efficiency_change_percentage,
"bold",
PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME,
numpy.array([]),
None,
),
f"[white dim]{POWER_NAME_WITH_SYMBOL}": (
(photovoltaic_power, "white dim"),
(photovoltaic_power_unit, "white dim"),
(photovoltaic_power_mean, "white dim"),
(photovoltaic_power_mean_unit, "white dim"),
photovoltaic_power_std,
None,
"bold",
EFFECTIVE_IRRADIANCE_NAME,
photovoltaic_power_series,
None,
),
f"[green bold]{ENERGY_NAME_WITH_SYMBOL}": (
(photovoltaic_energy, "green"),
(photovoltaic_energy_unit, "green"),
(photovoltaic_energy_mean, "bold green"),
(photovoltaic_energy_mean_unit, "green"),
None,
None,
"bold",
EFFECTIVE_IRRADIANCE_NAME,
numpy.array([]),
None,
),
f"[white dim]{NET_EFFECT}": (
(total_change, "white dim"),
(total_change_unit, "white dim"),
(total_change_mean, "white dim"),
(total_change_mean_unit, "white dim"),
None,
total_change_percentage,
"dim",
IN_PLANE_IRRADIANCE,
numpy.array([]),
None,
),
}
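Each key in the dictionary above maps a (possibly Rich-styled) row label to a fixed-order tuple: styled value and unit, styled mean and its unit, spread, percentage, row style, reference quantity, backing series, and source. A minimal, hypothetical sketch of consuming such (value, style) tuples; the labels and styles here are invented for illustration:

```python
# Sketch of the row-container pattern above: each label maps to a tuple
# whose leading entries pair a value with a display style string.
rows = {
    "Reflectivity": ((-12.3, "magenta"), ("W/m2", "cyan dim")),
    "Spectral effect": ((4.5, "magenta"), ("W/m2", "cyan dim")),
}

def render_rows(rows: dict) -> list[str]:
    """Format each (value, style) pair, ignoring the style for plain text."""
    lines = []
    for label, entries in rows.items():
        value, _value_style = entries[0]
        unit, _unit_style = entries[1]
        lines.append(f"{label}: {value} {unit}")
    return lines
```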
summarise ¶
Functions:
| Name | Description |
|---|---|
summarise_photovoltaic_performance | Generate a simplified report for photovoltaic performance, focusing only on quantities and their values. |
summarise_photovoltaic_performance ¶
summarise_photovoltaic_performance(
dictionary: dict,
longitude,
latitude,
elevation,
timestamps: DatetimeIndex | Timestamp,
frequency: Frequency = Hourly,
analysis: AnalysisLevel = Simple,
angle_output_units: str = RADIANS,
rounding_places: int = ROUNDING_PLACES_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
)
Generate a simplified report for photovoltaic performance, focusing only on quantities and their values.
Source code in pvgisprototype/api/performance/summarise.py
def summarise_photovoltaic_performance(
dictionary: dict,
longitude,
latitude,
elevation,
# surface_orientation: bool = True,
# surface_tilt: bool = True,
timestamps: DatetimeIndex | Timestamp,
frequency: Frequency = Frequency.Hourly,
analysis: AnalysisLevel = AnalysisLevel.Simple,
angle_output_units: str = RADIANS,
rounding_places: int = ROUNDING_PLACES_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
):
"""
Generate a simplified report for photovoltaic performance, focusing only on quantities and their values.
"""
from pvgisprototype.api.utilities.conversions import round_float_values
latitude = round_float_values(
latitude, rounding_places
)
# position_table.add_row(f"{LATITUDE_NAME}", f"[bold]{latitude}[/bold]")
longitude = round_float_values(
longitude, rounding_places
)
# surface_orientation = (
# dictionary.get(SURFACE_ORIENTATION_COLUMN_NAME, None)
# if surface_orientation
# else None
# )
# surface_orientation = round_float_values(
# surface_orientation, rounding_places
# )
# surface_tilt = (
# dictionary.get(SURFACE_TILT_COLUMN_NAME, None) if surface_tilt else None
# )
# surface_tilt = round_float_values(surface_tilt, rounding_places)
photovoltaic_performance_analysis = analyse_photovoltaic_performance(
dictionary=dictionary,
timestamps=timestamps,
frequency=frequency,
rounding_places=rounding_places,
dtype=dtype,
array_backend=ARRAY_BACKEND_DEFAULT,
verbose=verbose,
)
photovoltaic_module, mount_type = dictionary.technology.split(":")
peak_power = photovoltaic_performance_analysis.get(PEAK_POWER_COLUMN_NAME, None)
def get_value(value_key, unit_key, default=None):
value = photovoltaic_performance_analysis.get(value_key, default)
unit = photovoltaic_performance_analysis.get(unit_key, default)
return {"value": value, "unit": unit}
# longitude = convert_float_to_degrees_if_requested(longitude, angle_output_units)
# latitude = convert_float_to_degrees_if_requested(latitude, angle_output_units)
# surface_orientation = convert_float_to_degrees_if_requested(surface_orientation, angle_output_units)
# surface_tilt = convert_float_to_degrees_if_requested(surface_tilt, angle_output_units)
performance_analysis_container = {
"Location & Position": lambda: {
LATITUDE_COLUMN_NAME: {"value": latitude, "unit": angle_output_units},
LONGITUDE_COLUMN_NAME: {"value": longitude, "unit": angle_output_units},
ELEVATION_COLUMN_NAME: {"value": elevation, "unit": "meters"},
# SURFACE_ORIENTATION_COLUMN_NAME: {
# "value": surface_orientation,
# "unit": angle_output_units,
# },
# SURFACE_TILT_COLUMN_NAME: {"value": surface_tilt, "unit": angle_output_units},
"Start time": str(timestamps[0].strftime("%Y-%m-%d %H:%M")),
"End time": str(timestamps[-1].strftime("%Y-%m-%d %H:%M")),
"Frequency": frequency,
},
"Minimal": lambda: (
{
PHOTOVOLTAIC_ENERGY_COLUMN_NAME: get_value(
PHOTOVOLTAIC_ENERGY_COLUMN_NAME,
UNIT_FOR_PHOTOVOLTAIC_ENERGY_COLUMN_NAME,
),
TECHNOLOGY_NAME: photovoltaic_module,
PEAK_POWER_COLUMN_NAME: peak_power,
"Mount type": mount_type,
}
if analysis.value == AnalysisLevel.Minimal
else {}
),
"Simple": lambda: (
{
TOTAL_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME: get_value(
TOTAL_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME,
UNIT_FOR_TOTAL_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME,
),
PHOTOVOLTAIC_ENERGY_COLUMN_NAME: get_value(
PHOTOVOLTAIC_ENERGY_COLUMN_NAME,
UNIT_FOR_PHOTOVOLTAIC_ENERGY_COLUMN_NAME,
),
TOTAL_EFFECT_COLUMN_NAME: get_value(
TOTAL_EFFECT_COLUMN_NAME,
UNIT_FOR_TOTAL_EFFECT_COLUMN_NAME,
),
TECHNOLOGY_NAME: photovoltaic_module,
PEAK_POWER_COLUMN_NAME: peak_power,
"Mount type": mount_type,
}
if analysis.value == AnalysisLevel.Simple
else {}
),
"Advanced": lambda: (
{
TOTAL_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME: get_value(
TOTAL_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME,
UNIT_FOR_TOTAL_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME,
),
GLOBAL_IN_PLANE_IRRADIANCE_AFTER_REFLECTIVITY_COLUMN_NAME: get_value(
GLOBAL_IN_PLANE_IRRADIANCE_AFTER_REFLECTIVITY_COLUMN_NAME,
UNIT_FOR_MEAN_GLOBAL_IN_PLANE_IRRADIANCE_AFTER_REFLECTIVITY_COLUMN_NAME,
),
EFFECTIVE_IRRADIANCE_COLUMN_NAME: get_value(
EFFECTIVE_IRRADIANCE_COLUMN_NAME,
UNIT_FOR_EFFECTIVE_IRRADIANCE_COLUMN_NAME,
),
TOTAL_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME: get_value(
TOTAL_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME,
UNIT_FOR_TOTAL_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME,
),
TOTAL_PHOTOVOLTAIC_POWER_COLUMN_NAME: get_value(
TOTAL_PHOTOVOLTAIC_POWER_COLUMN_NAME,
UNIT_FOR_TOTAL_PHOTOVOLTAIC_POWER_COLUMN_NAME,
),
PHOTOVOLTAIC_ENERGY_COLUMN_NAME: get_value(
PHOTOVOLTAIC_ENERGY_COLUMN_NAME,
UNIT_FOR_PHOTOVOLTAIC_ENERGY_COLUMN_NAME,
),
TOTAL_EFFECT_COLUMN_NAME: get_value(
TOTAL_EFFECT_COLUMN_NAME,
UNIT_FOR_TOTAL_EFFECT_COLUMN_NAME,
),
TECHNOLOGY_NAME: photovoltaic_module,
PEAK_POWER_COLUMN_NAME: peak_power,
"Mount type": mount_type,
}
if analysis.value == AnalysisLevel.Advanced
else {}
),
"Extended": lambda: (
{
TOTAL_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME: get_value(
TOTAL_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME,
UNIT_FOR_TOTAL_GLOBAL_IN_PLANE_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME,
),
TOTAL_REFLECTIVITY_EFFECT_COLUMN_NAME: get_value(
TOTAL_REFLECTIVITY_EFFECT_COLUMN_NAME,
UNIT_FOR_TOTAL_REFLECTIVITY_EFFECT_COLUMN_NAME,
),
GLOBAL_IN_PLANE_IRRADIANCE_AFTER_REFLECTIVITY_COLUMN_NAME: get_value(
GLOBAL_IN_PLANE_IRRADIANCE_AFTER_REFLECTIVITY_COLUMN_NAME,
UNIT_FOR_MEAN_GLOBAL_IN_PLANE_IRRADIANCE_AFTER_REFLECTIVITY_COLUMN_NAME,
),
TOTAL_SPECTRAL_EFFECT_COLUMN_NAME: get_value(
TOTAL_SPECTRAL_EFFECT_COLUMN_NAME,
UNIT_FOR_TOTAL_SPECTRAL_EFFECT_COLUMN_NAME,
),
EFFECTIVE_IRRADIANCE_COLUMN_NAME: get_value(
EFFECTIVE_IRRADIANCE_COLUMN_NAME,
UNIT_FOR_EFFECTIVE_IRRADIANCE_COLUMN_NAME,
),
TOTAL_TEMPERATURE_AND_LOW_IRRADIANCE_EFFECT_COLUMN_NAME: get_value(
TOTAL_TEMPERATURE_AND_LOW_IRRADIANCE_EFFECT_COLUMN_NAME,
UNIT_FOR_TEMPERATURE_AND_LOW_IRRADIANCE_EFFECT_COLUMN_NAME,
),
TOTAL_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME: get_value(
TOTAL_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME,
UNIT_FOR_TOTAL_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME,
),
TOTAL_SYSTEM_EFFICIENCY_EFFECT_COLUMN_NAME: get_value(
TOTAL_SYSTEM_EFFICIENCY_EFFECT_COLUMN_NAME,
UNIT_FOR_TOTAL_SYSTEM_EFFICIENCY_EFFECT_COLUMN_NAME,
),
TOTAL_PHOTOVOLTAIC_POWER_COLUMN_NAME: get_value(
TOTAL_PHOTOVOLTAIC_POWER_COLUMN_NAME,
UNIT_FOR_TOTAL_PHOTOVOLTAIC_POWER_COLUMN_NAME,
),
PHOTOVOLTAIC_ENERGY_COLUMN_NAME: get_value(
PHOTOVOLTAIC_ENERGY_COLUMN_NAME,
UNIT_FOR_PHOTOVOLTAIC_ENERGY_COLUMN_NAME,
),
TOTAL_EFFECT_COLUMN_NAME: get_value(
TOTAL_EFFECT_COLUMN_NAME,
UNIT_FOR_TOTAL_EFFECT_COLUMN_NAME,
),
TECHNOLOGY_NAME: photovoltaic_module,
PEAK_POWER_COLUMN_NAME: peak_power,
"Mount type": mount_type,
}
if analysis.value == AnalysisLevel.Extended
else {}
),
}
performance_analysis = {}
for _, level in performance_analysis_container.items():
performance_analysis.update(level())
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return performance_analysis
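The report is assembled from per-level lambdas: every level contributes its block only when it matches the requested `AnalysisLevel`, and the blocks are merged with `dict.update()`. Reduced to a sketch with invented enum values and keys:

```python
from enum import Enum

class AnalysisLevel(str, Enum):  # assumed shape of the package's enum
    Minimal = "Minimal"
    Simple = "Simple"

def summarise(analysis: AnalysisLevel) -> dict:
    # Lambdas defer building each block until the level is actually requested.
    container = {
        "Minimal": lambda: {"energy": 1234} if analysis == AnalysisLevel.Minimal else {},
        "Simple": lambda: (
            {"energy": 1234, "irradiance": 987}
            if analysis == AnalysisLevel.Simple
            else {}
        ),
    }
    summary = {}
    for level in container.values():
        summary.update(level())
    return summary
```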
plot ¶
Functions:
| Name | Description |
|---|---|
convert_and_resample | Parameters |
uniplot_data_array_series | Plot time series in the terminal |
uniplot_solar_position_series | |
uniplot_spectral_factor_series | Plot spectral factor series for different module types in the terminal using the uniplot library. |
convert_and_resample ¶
convert_and_resample(
array: NpNDArray,
timestamps: DatetimeIndex,
convert_false_to_none: bool = False,
resample_large_series: bool = False,
frequency: ResampleCompatible = "1ME",
) -> DataArray
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
convert_false_to_none | bool | Convert False to None for arrays of type bool. This is meaningful in the context of Uniplot in order to plot series of Boolean values along with numeric ones. For example, plot series of surface in-shade, which is a series of True or False states, alongside other solar position series, e.g. solar altitude. Only True entries are then visualised, which shows when the surface in question is directly sunlit. | False |
Source code in pvgisprototype/api/plot.py
def convert_and_resample(
array: NpNDArray,
timestamps: DatetimeIndex,
convert_false_to_none: bool = False,
resample_large_series: bool = False,
frequency: ResampleCompatible = "1ME", # Sane default ?
) -> xarray.DataArray:
"""
Parameters
----------
convert_false_to_none : bool
Convert False to None for arrays of type bool. This is meaningful in
the context of Uniplot in order to plot series of Boolean values along
with numeric ones. For example, plot series of _surface in-shade_,
which is a series of True or False _states_, alongside other solar
position series, e.g. solar altitude. Only True entries are then
visualised which shows when the solar surface in question
is directly sunlit.
"""
# Ensure array and timestamps are of the same size
if array.size != timestamps.size:
# Handle empty array case
if array.size == 0:
logger.warning("The provided array is empty!")
return xarray.DataArray([])
else:
raise ValueError(f"The size of the data array {array.size} and timestamps {timestamps.size} must match.")
# In the context of Uniplot : convert False to None
# This is meaningful to plot series of Boolean values alongside numeric ones.
if convert_false_to_none:
if array.dtype == bool:
array = numpy.where(array, True, None)
# Create xarray DataArray with time dimension
data_array = xarray.DataArray(array, coords=[timestamps], dims=["time"])
if resample_large_series:
return data_array.resample(time=frequency).mean()
return data_array
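The resampling step can be illustrated with plain pandas, whose resampler `xarray.DataArray.resample` mirrors. This sketch (not the package's API) uses the version-portable "MS" month-start alias instead of "1ME":

```python
import numpy as np
import pandas as pd

def resample_sketch(array: np.ndarray, timestamps: pd.DatetimeIndex,
                    frequency: str = "MS") -> pd.Series:
    """Mirror convert_and_resample(): mismatched sizes are an error,
    an empty array yields an empty result, otherwise resample by mean."""
    if array.size != timestamps.size:
        if array.size == 0:
            return pd.Series(dtype=float)
        raise ValueError(
            f"The size of the data array {array.size} and timestamps {timestamps.size} must match."
        )
    return pd.Series(array, index=timestamps).resample(frequency).mean()
```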
uniplot_data_array_series ¶
uniplot_data_array_series(
data_array,
list_extra_data_arrays=None,
longitude: float | None = None,
latitude: float | None = None,
orientation: List[float] | float | None = None,
tilt: List[float] | float | None = None,
timestamps: DatetimeIndex | None = DatetimeIndex([]),
convert_false_to_none: bool = True,
resample_large_series: bool = False,
frequency: ResampleCompatible = "1ME",
lines: bool = True,
supertitle: str | None = None,
title: str | None = None,
label: str | None = None,
extra_legend_labels: List[str] | None = None,
unit: str = UNIT_NAME,
terminal_width_fraction: float = TERMINAL_WIDTH_FRACTION,
verbose: int = VERBOSE_LEVEL_DEFAULT,
)
Plot time series in the terminal
Source code in pvgisprototype/api/plot.py
@log_function_call
def uniplot_data_array_series(
data_array,
list_extra_data_arrays = None,
longitude: float | None = None,
latitude: float | None = None,
orientation: List[float] | float | None = None,
tilt: List[float] | float | None = None,
# time_series_2: Path = None,
timestamps: DatetimeIndex | None = DatetimeIndex([]),
convert_false_to_none: bool = True,
resample_large_series: bool = False,
frequency: ResampleCompatible = "1ME", # Sane default ?
lines: bool = True,
supertitle: str | None = None,
title: str | None = None,
label: str | None = None,
extra_legend_labels: List[str] | None = None,
unit: str = UNIT_NAME,
terminal_width_fraction: float = TERMINAL_WIDTH_FRACTION,
verbose: int = VERBOSE_LEVEL_DEFAULT,
):
"""Plot time series in the terminal"""
import shutil
from functools import partial
from uniplot import plot as default_plot
terminal_columns, _ = shutil.get_terminal_size() # we don't need lines!
terminal_length = int(terminal_columns * terminal_width_fraction)
plot = partial(default_plot, width=terminal_length)
# Convert data_array to an Xarray DataArray, possibly resample
data_array = convert_and_resample(
array=data_array,
timestamps=timestamps,
convert_false_to_none=convert_false_to_none,
resample_large_series=resample_large_series,
frequency=frequency,
)
if list_extra_data_arrays:
list_extra_data_arrays = [
convert_and_resample(
array=extra_array,
timestamps=timestamps,
convert_false_to_none=convert_false_to_none,
resample_large_series=resample_large_series,
)
for extra_array in list_extra_data_arrays
if extra_array.size > 0 # Process only non-empty arrays
]
y_series = [data_array] + (list_extra_data_arrays if list_extra_data_arrays else [])
timestamps_series = [DatetimeIndex(data_array.time)] * len(y_series) # list same DatetimeIndex for each series
if isinstance(data_array, float):
logger.error(
f"{exclamation_mark} Aborting as I cannot plot the single float value {data_array}!",
alt=f"{exclamation_mark} [red]Aborting[/red] as I [red]cannot[/red] plot the single float value {data_array}!",
)
return
if longitude and latitude:
title = (title or '') + f' observed from (longitude, latitude) {longitude}, {latitude}'
# supertitle = getattr(photovoltaic_power_output_series, 'long_name', 'Untitled')
# label = getattr(photovoltaic_power_output_series, 'name', None)
# label_2 = getattr(photovoltaic_power_output_series_2, 'name', None) if photovoltaic_power_output_series_2 is not None else None
# unit = getattr(photovoltaic_power_output_series, 'units', None)
supertitle = getattr(data_array, "long_name", "Untitled")
label = label if label else getattr(data_array, "name", None)
list_extra_data_arrays = (
list_extra_data_arrays if list_extra_data_arrays is not None else []
)
# legend_labels = [label] + [getattr(extra_array, 'name', None) for extra_array in list_extra_data_arrays]
legend_labels = [label] + (extra_legend_labels if extra_legend_labels else [])
unit = getattr(data_array, "units", None) or unit
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
infinite_values = [
# numpy.isinf(array)
numpy.isinf(array.values)
if array.dtype != "object" else False
for array in y_series
]
# Check for infinite values
if any(numpy.any(infinites) for infinites in infinite_values):
# y_series = [nan_to_num(array, nan=0.0, posinf=finfo(float32).max, neginf=-finfo(float32).max) for array in y_series]
# stub_array = full(infinite_values.shape, -1, dtype=int)
# index_array = arange(len(infinite_values))
# infinite_values_indices = where(infinite_values, index_array, stub_array)
error_message = f"Found infinite values in y_series :\n{y_series}"
error_message += f"\nMaybe it is necessary to debug the upstream functions that generated this output ?"
error_message_alternative = (
f"Found infinite values in [code]y_series[/code] :\n{y_series}"
)
error_message_alternative += f"\n[bold yellow]Maybe it is necessary to debug the upstream functions that generated this output ?[/bold yellow]"
logger.error(error_message, alt=error_message_alternative)
print("[reverse]Uniplot[/reverse]")
try:
plot(
xs=timestamps_series,
ys=y_series,
legend_labels=legend_labels,
legend_placement='auto',
lines=lines,
title=title if title else supertitle,
y_unit=" " + str(unit),
# force_ascii=True,
# color=False,
)
except IOError as e:
raise IOError(f"Could not _uniplot_ {data_array.value=}") from e
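Binding the plot width ahead of time with `functools.partial`, as done above with uniplot's `plot`, keeps every later call free of layout arguments. A sketch with a stand-in plot function (uniplot itself is not required here):

```python
import shutil
from functools import partial

def default_plot(width: int, **kwargs) -> dict:
    """Stand-in for uniplot.plot: echo the arguments it would receive."""
    return {"width": width, **kwargs}

# Pre-bind the width to a fraction of the terminal columns.
terminal_columns, _ = shutil.get_terminal_size()  # lines are unused
plot = partial(default_plot, width=int(terminal_columns * 0.9))
```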
uniplot_solar_position_series ¶
uniplot_solar_position_series(
solar_position_series: dict,
position_parameters: List[SolarPositionParameter] = all,
timestamps: DatetimeIndex | None = None,
timezone: ZoneInfo | None = None,
surface_orientation=None,
surface_tilt=None,
longitude: float = None,
latitude: float = None,
convert_false_to_none: bool = True,
resample_large_series: bool = False,
frequency: ResampleCompatible = None,
lines: bool = True,
supertitle: str = None,
title: str = None,
label: str = None,
legend_labels: str = None,
caption: bool = True,
terminal_width_fraction: float = TERMINAL_WIDTH_FRACTION,
verbose: int = VERBOSE_LEVEL_DEFAULT,
)
Source code in pvgisprototype/api/plot.py
def uniplot_solar_position_series(
solar_position_series: dict,
position_parameters: List[SolarPositionParameter] = SolarPositionParameter.all,
timestamps: DatetimeIndex | None = None,
timezone: ZoneInfo | None = None,
# index: bool = False,
surface_orientation=None,
surface_tilt=None,
longitude: float = None,
latitude: float = None,
# time_series_2: Path = None,
convert_false_to_none: bool = True,
resample_large_series: bool = False,
frequency: ResampleCompatible = None,
lines: bool = True,
supertitle: str = None,
title: str = None,
label: str = None,
legend_labels: str = None,
# unit: str = UNIT_NAME,
caption: bool = True,
terminal_width_fraction: float = TERMINAL_WIDTH_FRACTION,
verbose: int = VERBOSE_LEVEL_DEFAULT,
):
"""
"""
individual_series = None
individual_series_labels = None
for model_name, model_result in solar_position_series.items():
# Important ! Flatten the structure !
model_result = flatten_dictionary(model_result)
# First, _pop_ solar incidence series, if any and not a string !
solar_incidence_series = (
model_result.pop(SolarPositionParameterColumnName.incidence, numpy.array([]))
if not isinstance(model_result.get(SolarPositionParameterColumnName.incidence), str)
else None
)
# If this is the case : adjust the label for the incidence series
if solar_incidence_series is not None and solar_incidence_series.size > 0:
label = (
f"{model_result.get(INCIDENCE_DEFINITION, NOT_AVAILABLE)} "
+ "Incidence "
+ f" ({model_result.get(INCIDENCE_ALGORITHM_NAME, NOT_AVAILABLE)})"
)
# However, and except for the overview command, we expect _one_ angular metric time series
if len(position_parameters) == 1:
first_position_parameter_column_name = SolarPositionParameterColumnName[position_parameters[0].name]
solar_position_metric_series = model_result.pop(first_position_parameter_column_name)
else: # pop the first item from the `model_result`
solar_position_metric_series = (
solar_incidence_series
if solar_incidence_series is not None
else model_result.pop(next(iter(model_result)))
)
# get the rest of metrics too -- Why not pop ? ReviewMe ----------
individual_series = [
# Attention : SolarPositionParameterColumnName is an Enum class !
# Here :
# `some_parameter.name` is the SolarPositionParameter's member name !
# `SolarPositionParameter[some_parameter].value` _is_ the column name we want !
model_result.get(SolarPositionParameterColumnName[parameter.name].value, numpy.array([]))
for parameter in position_parameters
if not isinstance(model_result.get(parameter), str)
and parameter != SolarPositionParameterColumnName.incidence
]
# ----------------------------------------------------------------
individual_series_labels = []
for parameter in position_parameters:
# Note : pop-ing the parameter from position_parameters causes
# missing labels in the final "uniplot" !
# position_parameters.pop(position_parameters.index(parameter))
if (
parameter.name in SolarPositionParameterColumnName.__members__.keys()
and parameter.name != SolarPositionParameterColumnName.incidence.name
):
metric_label = SOLAR_POSITION_PARAMETER_COLUMN_NAMES[parameter]
# Add the origin-of-azimuth in the label for solar azimuth series
if parameter == SolarPositionParameterColumnName.azimuth:
metric_label = (
[
f"{label} {model_result.get(AZIMUTH_ORIGIN_NAME, NOT_AVAILABLE)}"
for label in metric_label
]
if isinstance(metric_label, list)
else f"{metric_label} {model_result.get(AZIMUTH_ORIGIN_NAME, NOT_AVAILABLE)}"
)
# extend or append to the list of labels
if isinstance(metric_label, list):
individual_series_labels.extend(metric_label)
else:
individual_series_labels.append(metric_label)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
uniplot_data_array_series(
data_array=solar_position_metric_series,
list_extra_data_arrays=individual_series,
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
convert_false_to_none=convert_false_to_none,
resample_large_series=resample_large_series,
frequency=frequency,
lines=True,
supertitle=f"{supertitle} {model_name}",
title=title,
label=label,
extra_legend_labels=individual_series_labels,
unit=model_result.get(UNIT_NAME, UNITLESS),
terminal_width_fraction=terminal_width_fraction,
verbose=verbose,
)
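The function first flattens each model's nested result so series can be fetched by column name. The actual `flatten_dictionary` helper is defined elsewhere in the package; a minimal stand-in that merges nested mappings while keeping leaf keys might look like:

```python
def flatten_dictionary(nested: dict) -> dict:
    """Merge nested mappings into one level, keeping leaf keys, so that
    series can be looked up by their column name directly. A sketch only:
    the package's own helper is not shown in this section."""
    flat = {}
    for key, value in nested.items():
        if isinstance(value, dict):
            flat.update(flatten_dictionary(value))
        else:
            flat[key] = value
    return flat
```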
uniplot_spectral_factor_series ¶
uniplot_spectral_factor_series(
spectral_factor_dictionary: Dict,
spectral_factor_model: List,
photovoltaic_module_type: List,
timestamps: DatetimeIndex,
convert_false_to_none: bool = True,
resample_large_series: bool = False,
frequency: str = None,
supertitle: str = "Spectral Factor Series",
title: str = "Spectral Factor",
terminal_width_fraction: float = 0.9,
verbose: int = 0,
)
Plot spectral factor series for different module types in the terminal using the uniplot library.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
convert_false_to_none | bool | Convert False to None for arrays of type bool. This is meaningful in the context of Uniplot in order to plot series of Boolean values along with numeric ones. For example, plot series of surface in-shade, which is a series of True or False states, alongside other solar position series, e.g. solar altitude. Only True entries are then visualised, which shows when the surface in question is directly sunlit. | True |
Source code in pvgisprototype/api/plot.py
def uniplot_spectral_factor_series(
spectral_factor_dictionary: Dict,
spectral_factor_model: List,
photovoltaic_module_type: List,
timestamps: DatetimeIndex,
convert_false_to_none: bool = True,
resample_large_series: bool = False,
frequency: str = None,
supertitle: str = "Spectral Factor Series",
title: str = "Spectral Factor",
terminal_width_fraction: float = 0.9,
verbose: int = 0,
):
"""Plot spectral factor series for different module types in the terminal using the uniplot library.
Parameters
----------
- spectral_factor: Dictionary containing spectral factor data.
- spectral_factor_model: List of spectral factor models.
- photovoltaic_module_type: List of photovoltaic module types.
- timestamps: DatetimeIndex of the time series.
convert_false_to_none : bool
Convert False to None for arrays of type bool. This is meaningful in
the context of Uniplot in order to plot series of Boolean values along
with numeric ones. For example, plot series of _surface in-shade_,
which is a series of True or False _states_, alongside other solar
position series, e.g. solar altitude. Only True entries are then
visualised which shows when the solar surface in question
is directly sunlit.
- resample_large_series: Whether to resample large series.
- supertitle: Supertitle for the plot.
- title: Title for the plot.
- terminal_width_fraction: Width of the terminal for plotting.
- verbose: Verbosity level.
"""
data_arrays = []
labels = []
for spectral_factor_model, result in spectral_factor_dictionary.items():
title += f" ({spectral_factor_model.value})"
for module_type in result:
spectral_factor_for_module = spectral_factor_dictionary[spectral_factor_model][
module_type
]
spectral_factor_series = spectral_factor_for_module.get(
SPECTRAL_FACTOR_COLUMN_NAME
)
# Convert memoryview buffers to a NumPy array, if needed
if isinstance(spectral_factor_series, memoryview):
spectral_factor_series = numpy.array(spectral_factor_series)
data_array = xarray.DataArray(
spectral_factor_series, coords=[timestamps], dims=["time"]
)
data_arrays.append(data_array)
label = f"{module_type.value}"
if len(spectral_factor_dictionary) > 1:
label += f" {spectral_factor_model.name}"
labels.append(label)
uniplot_data_array_series(
data_array=data_arrays[0],
list_extra_data_arrays=data_arrays[1:],
timestamps=timestamps,
convert_false_to_none=convert_false_to_none,
resample_large_series=resample_large_series,
frequency=frequency,
lines=True,
supertitle=supertitle,
title=title,
label=labels[0],
extra_legend_labels=labels[1:],
unit="",
terminal_width_fraction=terminal_width_fraction,
verbose=verbose,
)
position ¶
Modules:
| Name | Description |
|---|---|
altitude | |
azimuth | An overview of conventions and conversions from a North-based system to either |
conversions | |
declination | |
event_time | |
fractional_year | |
hour_angle | |
incidence | API modules to calculate the solar incidence angle between the direction of the |
models | |
output | |
overview | |
shading | |
solar_time | |
zenith | |
altitude ¶
Functions:
| Name | Description |
|---|---|
calculate_solar_altitude_series | Calculates the solar position using the requested models and returns the |
model_solar_altitude_series | Notes |
calculate_solar_altitude_series ¶
calculate_solar_altitude_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_position_models: List[SolarPositionModel] = [
noaa
],
adjust_for_atmospheric_refraction: bool = True,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> Dict
Calculates the solar position using the requested models and returns the results in a dictionary.
Source code in pvgisprototype/api/position/altitude.py
@log_function_call
def calculate_solar_altitude_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_position_models: List[SolarPositionModel] = [SolarPositionModel.noaa],
adjust_for_atmospheric_refraction: bool = True,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> Dict:
"""Calculates the solar position using the requested models and returns the
results in a dictionary.
"""
results = {}
for solar_position_model in solar_position_models:
if (
solar_position_model != SolarPositionModel.all
): # ignore 'all' in the enumeration
solar_altitude_series = model_solar_altitude_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_position_model=solar_position_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
solar_altitude_series.build_output(
verbose=verbose,
fingerprint=fingerprint,
angle_output_units=angle_output_units,
)
solar_altitude_overview = {
solar_position_model.name: solar_altitude_series.output
}
results = results | solar_altitude_overview
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return results
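The loop accumulates one output dictionary per requested model, skipping the `all` sentinel, by unioning dictionaries with the `|` operator. The pattern in isolation, with an assumed shape for the enum:

```python
from enum import Enum

class SolarPositionModel(str, Enum):  # assumed shape of the package's enum
    all = "all"
    noaa = "noaa"
    pvlib = "pvlib"

def collect_results(models: list[SolarPositionModel]) -> dict:
    """Accumulate one output dictionary per model, skipping the 'all'
    sentinel, via the dict-union pattern used above."""
    results = {}
    for model in models:
        if model != SolarPositionModel.all:
            results = results | {model.name: f"series for {model.value}"}
    return results
```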
model_solar_altitude_series ¶
model_solar_altitude_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex | Timestamp | None,
timezone: ZoneInfo,
solar_position_model: SolarPositionModel = noaa,
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> SolarAltitude
Notes
The solar altitude angle measures from the horizon up towards the zenith (positive) and down towards the nadir (negative). The altitude is zero all along the horizon, the great circle midway between zenith and nadir.
- All solar calculation functions return floating angular measurements in radians.
Source code in pvgisprototype/api/position/altitude.py
@log_function_call
@custom_cached
@validate_with_pydantic(ModelSolarAltitudeTimeSeriesInputModel)
def model_solar_altitude_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex | Timestamp | None,
timezone: ZoneInfo,
solar_position_model: SolarPositionModel = SolarPositionModel.noaa,
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> SolarAltitude:
"""
Notes
-----
The solar altitude angle measures from the horizon up towards the zenith
(positive) and down towards the nadir (negative). The altitude is zero all
along the horizon, the great circle midway between zenith and nadir.
- All solar calculation functions return floating angular measurements in
radians.
"""
logger.debug(
f"Executing solar positioning modelling function model_solar_altitude_series() for\n{timestamps}",
alt=f"Executing [underline]solar positioning modelling[/underline] function model_solar_altitude_series() for\n{timestamps}",
)
solar_altitude_series = None
if solar_position_model.value == SolarPositionModel.noaa:
solar_altitude_series = calculate_solar_altitude_series_noaa(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
if solar_position_model.value == SolarPositionModel.skyfield:
pass
# if solar_position_model.value == SolarPositionModel.skyfield:
# solar_altitude, solar_azimuth = calculate_solar_altitude_azimuth_skyfield(
# longitude=longitude,
# latitude=latitude,
# timestamp=timestamp,
# )
if solar_position_model.value == SolarPositionModel.suncalc:
pass
# if solar_position_model.value == SolarPositionModel.suncalc:
# # note : first azimuth, then altitude
# solar_azimuth_south_radians_convention, solar_altitude = suncalc.get_position(
# date=timestamp, # this comes first here!
# lng=longitude.degrees,
# lat=latitude.degrees,
# ).values() # zero points to south
# solar_altitude = SolarAltitude(
# value=solar_altitude,
# unit=RADIANS,
# solar_positioning_algorithm='suncalc',
# solar_timing_algorithm='suncalc',
# )
# if (
# not isfinite(solar_altitude.degrees)
# or not solar_altitude.min_degrees <= solar_altitude.degrees <= solar_altitude.max_degrees
# ):
# raise ValueError(
# f"The calculated solar altitude angle {solar_altitude.degrees} is out of the expected range\
# [{solar_altitude.min_degrees}, {solar_altitude.max_degrees}] degrees"
# )
if solar_position_model.value == SolarPositionModel.jenco:
solar_altitude_series = calculate_solar_altitude_series_jenco(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if solar_position_model.value == SolarPositionModel.hofierka:
solar_altitude_series = calculate_solar_altitude_series_hofierka(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if solar_position_model.value == SolarPositionModel.pvlib:
solar_altitude_series = calculate_solar_altitude_series_pvlib(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
logger.debug(
f"Returning solar altitude time series :\n{solar_altitude_series}",
alt=f"Returning [yellow]solar altitude[/yellow] time series :\n{solar_altitude_series}",
)
return solar_altitude_series
azimuth ¶
An overview of conventions and conversions from a North-based system to either East- or South-based systems is:
┌─────────────┐ ┌────────────┐ ┌────────────┐
│ N=0 │ │ N │ │ N │
│ ▲ │ │ ▲ │ │ ▲ │
Origin │ W ◄┼► E │ │ W ◄┼► E=0 │ │ W ◄┼► E │
│ ▼ │ │ ▼ │ │ ▼ │
│ S │ │ S │ │ S=0 │
└─────────────┘ └────────────┘ └────────────┘
┌─────────────┐ ┌────────────┐ ┌────────────┐
│ │ │ │ │ │
│ │ │ │ │ │
Input South │ 180 │ │ 90 │ │ 0 │
(IS) │ │ │ │ │ │
│ │ │ │ │ │
└─────────────┘ └────────────┘ └────────────┘
┌─────────────┐ ┌────────────┐ ┌────────────┐
│ │ │ │ │ │
Internal │ │ │ │ │ │
│ = │ │ IS - 90 │ │ IS - 180 │
Conversion │ │ │ │ │ │
│ │ │ │ │ │
└─────────────┘ └────────────┘ └────────────┘
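Each column of the diagram is a fixed origin shift modulo 360 degrees. A minimal, purely arithmetic sketch of such a shift (the function name is illustrative, not part of the API):

```python
def shift_convention_degrees(input_azimuth: float, offset: float) -> float:
    """Apply an origin shift between azimuth conventions,
    e.g. the 'IS - 90' or 'IS - 180' columns of the diagram."""
    return (input_azimuth - offset) % 360

# The 'IS - 180' column: a South-based azimuth and its North-based
# counterpart differ by 180 degrees; applying the shift twice is the
# identity, since -180 is its own inverse modulo 360.
round_trip = shift_convention_degrees(shift_convention_degrees(30, 180), 180)
```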
Functions:
| Name | Description |
|---|---|
calculate_solar_azimuth_series | Calculates the solar azimuth using all requested models and returns the results in a dictionary. |
model_solar_azimuth_series | The solar azimuth angle measures horizontally around the horizon from north |
calculate_solar_azimuth_series ¶
calculate_solar_azimuth_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_position_models: List[SolarPositionModel] = [
noaa
],
solar_time_model: SolarTimeModel = noaa,
adjust_for_atmospheric_refraction: bool = True,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> Dict
Calculates the solar azimuth using all requested models and returns the results in a dictionary keyed by model name.
Source code in pvgisprototype/api/position/azimuth.py
def calculate_solar_azimuth_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_position_models: List[SolarPositionModel] = [SolarPositionModel.noaa],
solar_time_model: SolarTimeModel = SolarTimeModel.noaa,
adjust_for_atmospheric_refraction: bool = True,
# unrefracted_solar_zenith: UnrefractedSolarZenith | None = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT, # radians
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> Dict:
"""
Calculates the solar azimuth using all requested models and returns the results in a dictionary keyed by model name.
"""
results = {}
for solar_position_model in solar_position_models:
if (
solar_position_model != SolarPositionModel.all
): # ignore 'all' in the enumeration
solar_azimuth_series = model_solar_azimuth_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_position_model=solar_position_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
solar_time_model=solar_time_model,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
verbose=verbose,
validate_output=validate_output,
)
solar_azimuth_series.build_output(
verbose=verbose,
fingerprint=fingerprint,
angle_output_units=angle_output_units,
)
solar_azimuth_overview = {
solar_position_model.name: solar_azimuth_series.output
}
results = results | solar_azimuth_overview
return results
model_solar_azimuth_series ¶
model_solar_azimuth_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex | None,
timezone: ZoneInfo | None,
solar_position_model: SolarPositionModel = noaa,
adjust_for_atmospheric_refraction: bool = True,
solar_time_model: SolarTimeModel = milne,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> SolarAzimuth
The solar azimuth angle measures horizontally around the horizon from north through east, south, and west.
Notes
- All solar calculation functions return floating angular measurements in radians.
Source code in pvgisprototype/api/position/azimuth.py
@log_function_call
@custom_cached
@validate_with_pydantic(ModelSolarAzimuthTimeSeriesInputModel)
def model_solar_azimuth_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex | None,
timezone: ZoneInfo | None,
solar_position_model: SolarPositionModel = SolarPositionModel.noaa,
adjust_for_atmospheric_refraction: bool = True,
# unrefracted_solar_zenith: UnrefractedSolarZenith | None = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT, # radians
solar_time_model: SolarTimeModel = SolarTimeModel.milne,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = 0,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> SolarAzimuth:
"""
The solar azimuth angle measures horizontally around the horizon from north
through east, south, and west.
Notes
-----
- All solar calculation functions return floating angular measurements in
radians.
"""
logger.debug(
f"Executing solar positioning modelling function model_solar_azimuth_series() for\n{timestamps}",
alt=f"Executing [underline]solar positioning modelling[/underline] function model_solar_azimuth_series() for\n{timestamps}"
)
solar_azimuth_series = None
if solar_position_model.value == SolarPositionModel.noaa:
solar_azimuth_series = calculate_solar_azimuth_series_noaa(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
if solar_position_model.value == SolarPositionModel.jenco:
solar_azimuth_series = calculate_solar_azimuth_series_jenco(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if solar_position_model.value == SolarPositionModel.skyfield:
pass
# solar_altitude, solar_azimuth = calculate_solar_altitude_azimuth_skyfield(
# longitude=longitude,
# latitude=latitude,
# timestamp=timestamp,
# )
if solar_position_model.value == SolarPositionModel.suncalc:
pass
# # note : first azimuth, then altitude
# solar_azimuth_south_radians_convention, solar_altitude = suncalc.get_position(
# date=timestamp, # this comes first here!
# lng=longitude.degrees,
# lat=latitude.degrees,
# ).values() # zero points to south
# solar_azimuth = convert_south_to_north_radians_convention(
# solar_azimuth_south_radians_convention
# )
# solar_azimuth = SolarAzimuth(
# value=solar_azimuth,
# unit=RADIANS,
# solar_positioning_algorithm='suncalc',
# solar_timing_algorithm='suncalc',
# )
if solar_position_model.value == SolarPositionModel.hofierka:
pass
# solar_azimuth = calculate_solar_azimuth_pvis(
# longitude=longitude,
# latitude=latitude,
# timestamp=timestamp,
# timezone=timezone,
# solar_time_model=solar_time_model,
# )
if solar_position_model.value == SolarPositionModel.pvlib:
solar_azimuth_series = calculate_solar_azimuth_series_pvlib(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
# log_data_fingerprint(
# data=solar_azimuth_series.value,
# log_level=log,
# hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
# )
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
logger.debug(
f"Returning solar azimuth time series :\n{solar_azimuth_series}",
alt=f"Returning [yellow]solar azimuth[/yellow] time series :\n{solar_azimuth_series}"
)
return solar_azimuth_series
conversions ¶
Functions:
| Name | Description |
|---|---|
convert_east_to_north_radians_convention | Convert an azimuth from East-based to North-based radians. |
convert_north_to_east_radians_convention | Convert an azimuth angle from North-based to East-based radians. |
convert_north_to_south_radians_convention | Convert an azimuth angle from North-based to South-based radians. |
convert_east_to_north_radians_convention ¶
Convert an azimuth from East-based to North-based radians.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
east_based_angle | NDArray | The azimuth angle with an east-based reference where East is 0 radians. | required |
Returns:
| Type | Description |
|---|---|
NDArray | The converted angle in radians where North is 0 radians. |
Notes
This conversion adds 3π/2 to the angle and takes modulo 2π.
Source code in pvgisprototype/api/position/conversions.py
def convert_east_to_north_radians_convention(
east_based_angle: NDArray,
) -> NDArray:
"""Convert an azimuth from East-based to North-based radians.
Parameters
----------
east_based_angle : NDArray
The angle with an east-based reference where East is 0 radians.
Returns
-------
NDArray
The converted angle in radians where North is 0 radians.
Notes
-----
This conversion adds 3π/2 to the angle and takes modulo 2π.
"""
return numpy.mod((east_based_angle + 3 * pi / 2), 2 * pi)
convert_north_to_east_radians_convention ¶
Convert an azimuth angle from North-based to East-based radians.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
north_based_angle | HasRadians | The angle with a north-based reference where North is 0 radians. | required |
Returns:
| Type | Description |
|---|---|
NDArray | The converted angle in radians where East is 0 radians. |
Notes
This conversion subtracts π/2 from the angle and takes modulo 2π.
Source code in pvgisprototype/api/position/conversions.py
def convert_north_to_east_radians_convention(north_based_angle: HasRadians) -> NDArray:
"""Convert an azimuth angle from North-based to East-based radians.
Parameters
----------
north_based_angle : HasRadians
The angle with a north-based reference where North is 0 radians.
Returns
-------
NDArray
The converted angle in radians where East is 0 radians.
Notes
-----
This conversion subtracts π/2 from the angle and takes modulo 2π.
"""
return numpy.mod((north_based_angle.radians - pi / 2), 2 * pi)
convert_north_to_south_radians_convention ¶
Convert an azimuth angle from North-based to South-based radians.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
north_based_angle | HasRadians | The angle with a north-based reference where North is 0 radians. | required |
Returns:
| Type | Description |
|---|---|
NDArray | The converted angle in radians where South is 0 radians. |
Notes
This conversion subtracts π from the angle and takes modulo 2π.
Source code in pvgisprototype/api/position/conversions.py
def convert_north_to_south_radians_convention(north_based_angle: HasRadians) -> NDArray:
"""Convert an azimuth angle from North-based to South-based radians.
Parameters
----------
north_based_angle : HasRadians
The angle with a north-based reference where North is 0 radians.
Returns
-------
NDArray
The converted angle in radians where South is 0 radians.
Notes
-----
This conversion subtracts π from the angle and takes modulo 2π.
"""
return numpy.mod((north_based_angle.radians - pi), 2 * pi)
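Each conversion above is a fixed origin shift modulo 2π, so the arithmetic can be checked with plain floats (the library itself operates on HasRadians objects and NumPy arrays). A minimal sketch of the North-to-South case:

```python
from math import pi

def north_to_south(angle_radians: float) -> float:
    # Same arithmetic as convert_north_to_south_radians_convention above,
    # applied to a plain float instead of a HasRadians object.
    return (angle_radians - pi) % (2 * pi)

# Due South (pi in the North-based frame) becomes 0 in the South-based frame;
# due North (0) becomes pi.
south_based_of_south = north_to_south(pi)
south_based_of_north = north_to_south(0.0)

# A shift of pi is its own inverse modulo 2*pi: applying it twice
# returns the original angle.
round_trip = north_to_south(north_to_south(1.0))
```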
declination ¶
Functions:
| Name | Description |
|---|---|
calculate_solar_declination_series | Calculate the solar declination angle |
model_solar_declination_series | |
calculate_solar_declination_series ¶
calculate_solar_declination_series(
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_declination_models: List[
SolarDeclinationModel
] = [pvis],
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> Dict
Calculate the solar declination angle
The solar declination is the angle between the rays of the sun and the equator of the earth. It is used to calculate the solar elevation and azimuth angles.
Source code in pvgisprototype/api/position/declination.py
def calculate_solar_declination_series(
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_declination_models: List[SolarDeclinationModel] = [
SolarDeclinationModel.pvis
],
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> Dict:
"""Calculate the solar declination angle
The solar declination is the angle between the rays of the sun and the
equator of the earth. It is used to calculate the solar elevation and
azimuth angles.
"""
results = {}
for solar_declination_model in solar_declination_models:
if (
solar_declination_model != SolarDeclinationModel.all
): # ignore 'all' in the enumeration
solar_declination_series = model_solar_declination_series(
timestamps=timestamps,
timezone=timezone,
solar_declination_model=solar_declination_model,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
array_backend=array_backend,
dtype=dtype,
verbose=verbose,
log=log,
validate_output=validate_output,
)
solar_declination_series.build_output(
verbose=verbose,
fingerprint=fingerprint,
angle_output_units=angle_output_units,
)
solar_declination_overview = {
solar_declination_model.name: solar_declination_series.output
}
results = results | solar_declination_overview
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return results
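The declination formulas of the individual models (pvis/Hofierka, NOAA, Hargreaves, pvlib) are not reproduced on this page. As a rough, illustrative cross-check only — this is *not* one of the models dispatched above — the widely used Cooper (1969) approximation yields values of the expected magnitude:

```python
from math import sin, radians

def cooper_declination_degrees(day_of_year: int) -> float:
    """Cooper (1969): delta = 23.45 * sin(360/365 * (284 + N)) degrees.
    Illustrative only; not the pvis/Hofierka model used by this package."""
    return 23.45 * sin(radians(360 / 365 * (284 + day_of_year)))

summer_solstice = cooper_declination_degrees(172)  # approx. +23.45 deg (June 21)
march_equinox = cooper_declination_degrees(81)     # approx. 0 deg (around March 22)
```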
model_solar_declination_series ¶
model_solar_declination_series(
timestamps: DatetimeIndex,
timezone: ZoneInfo = ZoneInfo("UTC"),
solar_declination_model: SolarDeclinationModel = pvis,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = VERBOSE_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> SolarDeclination
Source code in pvgisprototype/api/position/declination.py
@log_function_call
@custom_cached
def model_solar_declination_series(
timestamps: DatetimeIndex,
timezone: ZoneInfo = ZoneInfo("UTC"),
solar_declination_model: SolarDeclinationModel = SolarDeclinationModel.pvis,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = VERBOSE_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> SolarDeclination:
""" """
solar_declination_series = None
if solar_declination_model.value == SolarDeclinationModel.noaa:
solar_declination_series = calculate_solar_declination_series_noaa(
timestamps=timestamps,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
if solar_declination_model.value == SolarDeclinationModel.pvis:
solar_declination_series = calculate_solar_declination_series_hofierka(
timestamps=timestamps,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if solar_declination_model.value == SolarDeclinationModel.hargreaves:
solar_declination_series = calculate_solar_declination_series_hargreaves(
timestamps=timestamps,
dtype=dtype,
# array_backend=array_backend,
verbose=verbose,
log=log,
)
if solar_declination_model.value == SolarDeclinationModel.pvlib:
solar_declination_series = calculate_solar_declination_series_pvlib(
timestamps=timestamps,
# dtype=dtype,
# array_backend=array_backend,
# verbose=verbose,
# log=log,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=solar_declination_series.value,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return solar_declination_series
event_time ¶
Functions:
| Name | Description |
|---|---|
calculate_event_time_series | |
model_solar_event_time_series | |
calculate_event_time_series ¶
calculate_event_time_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo = ZoneInfo(TIMEZONE_UTC),
event: List[SolarEvent | None] = [None],
unrefracted_solar_zenith: UnrefractedSolarZenith = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT,
solar_position_models: List[SolarPositionModel] = [
noaa
],
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
)
Source code in pvgisprototype/api/position/event_time.py
def calculate_event_time_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo = ZoneInfo(TIMEZONE_UTC),
event: List[SolarEvent | None] = [None],
unrefracted_solar_zenith: UnrefractedSolarZenith = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT,
# adjust_for_atmospheric_refraction: bool = False,
solar_position_models: List[SolarPositionModel] = [SolarPositionModel.noaa],
# solar_time_model: SolarTimeModel = SolarTimeModel.noaa,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
):
"""
"""
# empty_array = create_array(
# timestamps.shape, dtype="object", init_method="empty", backend=array_backend
# )
results = {}
for solar_position_model in solar_position_models:
# for the time being! ------------------------------------------------
if solar_position_model != SolarPositionModel.noaa:
logger.warning(
f"Solar geometry overview series is not implemented for the requested solar position model: {solar_position_model}!"
)
# --------------------------------------------------------------------
if (
solar_position_model != SolarPositionModel.all
): # ignore 'all' in the enumeration
solar_event_series = model_solar_event_time_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
event=event,
unrefracted_solar_zenith=unrefracted_solar_zenith,
timezone=timezone,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
solar_event_series.build_output(
verbose=verbose,
fingerprint=fingerprint,
angle_output_units=angle_output_units,
)
solar_event_overview = {
solar_position_model.name: solar_event_series.output
}
results = results | solar_event_overview
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return results
model_solar_event_time_series ¶
model_solar_event_time_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo = ZoneInfo(TIMEZONE_UTC),
event: List[SolarEvent | None] = [None],
unrefracted_solar_zenith: UnrefractedSolarZenith = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Source code in pvgisprototype/api/position/event_time.py
def model_solar_event_time_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo = ZoneInfo(TIMEZONE_UTC),
event: List[SolarEvent | None] = [None],
unrefracted_solar_zenith: UnrefractedSolarZenith = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT,
# adjust_for_atmospheric_refraction: bool = False,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
"""
"""
event_time_series = calculate_solar_event_time_series_noaa(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,#.normalize().unique(),
event=event,
unrefracted_solar_zenith=unrefracted_solar_zenith,
timezone=timezone,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
return event_time_series
fractional_year ¶
hour_angle ¶
Functions:
| Name | Description |
|---|---|
calculate_event_hour_angle_series | Calculate the hour angle (ω) at sunrise and sunset |
calculate_solar_hour_angle_series | Calculate the hour angle ω' |
model_solar_hour_angle_series | |
calculate_event_hour_angle_series ¶
calculate_event_hour_angle_series(
latitude: Latitude,
surface_tilt: float = 0,
solar_declination: float = 0,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> HourAngleSunrise
Calculate the hour angle (ω) at sunrise and sunset
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
latitude | Latitude | Latitude (Φ) is the angular distance of the location north or south of the equator, measured in radians | required |
surface_tilt | float | Surface tilt (or slope) (β) is the angle between the inclined surface (slope) and the horizontal plane. | 0 |
solar_declination | float | Solar declination (δ) is the angle between the equator and a line drawn from the centre of the Earth to the centre of the sun measured in radians. | 0 |
Returns:
| Type | Description |
|---|---|
HourAngleSunrise | The hour angle (ω) at sunrise and sunset: the angle through which the earth has to turn to bring the meridian of the observer directly in line with the sun's rays, measured in radians. |
Notes
Hour angle = acos( -tan( Latitude Angle - Tilt Angle ) * tan( Declination Angle ) )
The hour angle (ω) at sunrise and sunset measures the angular distance between the sun at the local solar time and the sun at solar noon.
ω = acos(-tan(Φ-β)*tan(δ))
Source code in pvgisprototype/api/position/hour_angle.py
@validate_with_pydantic(CalculateEventHourAngleInputModel)
def calculate_event_hour_angle_series(
latitude: Latitude,
surface_tilt: float = 0,
solar_declination: float = 0,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> HourAngleSunrise:
"""Calculate the hour angle (ω) at sunrise and sunset
Parameters
----------
latitude: Latitude
Latitude (Φ) is the angular distance of the location north or south of the
equator, measured in radians
surface_tilt: float
Surface tilt (or slope) (β) is the angle between the inclined surface
(slope) and the horizontal plane.
solar_declination: float
Solar declination (δ) is the angle between the equator and a line drawn
from the centre of the Earth to the centre of the sun measured in
radians.
Returns
-------
HourAngleSunrise
The hour angle (ω) at sunrise and sunset: the angle through which the
earth has to turn to bring the meridian of the observer directly in line
with the sun's rays, measured in radians.
Notes
-----
Hour angle = acos( -tan( Latitude Angle - Tilt Angle ) * tan( Declination Angle ) )
The hour angle (ω) at sunrise and sunset measures the angular distance
between the sun at the local solar time and the sun at solar noon.
ω = acos(-tan(Φ-β)*tan(δ))
"""
hour_angle_sunrise = acos(
-tan(latitude.radians - surface_tilt.radians) * tan(solar_declination.radians)
)
return HourAngleSunrise(
value=hour_angle_sunrise,
unit=RADIANS,
)
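A quick numeric sanity check of ω = acos(-tan(Φ-β)·tan(δ)) with plain floats (the library version operates on Latitude objects and returns a HourAngleSunrise; the helper name here is illustrative):

```python
from math import acos, tan, radians, degrees

def sunrise_hour_angle(latitude_deg: float, declination_deg: float,
                       tilt_deg: float = 0.0) -> float:
    """omega = acos(-tan(phi - beta) * tan(delta)), returned in radians."""
    return acos(-tan(radians(latitude_deg - tilt_deg))
                * tan(radians(declination_deg)))

# 45 degrees N on a horizontal surface at the June solstice (delta ~ +23.45):
# omega comes out near 115.7 degrees, i.e. roughly 2 * 115.7 / 15 ~ 15.4 hours
# of daylight, as expected for a mid-latitude summer.
omega = sunrise_hour_angle(45.0, 23.45)
```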
calculate_solar_hour_angle_series ¶
calculate_solar_hour_angle_series(
longitude: Longitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_position_models: List[SolarPositionModel] = [
noaa
],
solar_time_model: SolarTimeModel = noaa,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> SolarHourAngle
Calculate the hour angle ω'
ω = (ST / 3600 - 12) * 15 * pi / 180
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
solar_time | float | The solar time (ST) is a calculation of the passage of time based on the position of the Sun in the sky. It is expected to be decimal hours in a 24 hour format and measured internally in seconds. | required |
output_units | str | Angle output units (degrees or radians). | RADIANS |
Returns:
| Type | Description |
|---|---|
SolarHourAngle | The hour angle (ω): the angle at any instant through which the earth has to turn to bring the meridian of the observer directly in line with the sun's rays, measured in radians. |
Notes
The hour angle ω (elsewhere symbolised with h) of a point on the earth’s surface is defined as the angle through which the earth would turn to bring the meridian of the point directly under the sun. The hour angle at local solar noon is zero, with each 360/24 or 15° of longitude equivalent to 1 h, afternoon hours being designated as positive. Expressed symbolically, the hour angle in degrees is:
h = ±0.25 (Number of minutes from local solar noon)
where the plus sign applies to afternoon hours and the minus sign to morning hours.
The hour angle can also be obtained from the apparent solar time (AST); that is, the corrected local solar time:
h = (AST - 12) * 15
At local solar noon, AST = 12 and h = 0°. Therefore, from Eq. (2.3), the local standard time (LST, the time shown by our clocks at local solar noon) is:
LST = 12 - ET ∓ 4 * (SL - LL)
where:
ET is the Equation of Time
SL Standard Longitude
LL Local Longitude
Example 1
The equation for LST at local solar noon for Nicosia, Cyprus is:
LST = 12 - ET - 13.32 (minutes)
Example 2
The ET for March 10 (N = 69) is calculated from Eq. (2.1), in which the factor B is obtained from Eq. (2.2) as:
B = 360 / 364 * (N - 81) = 360 / 364 * (69 - 81) = -11.87
ET = 9.87 * sin(2*B) - 7.53 * cos(B) - 1.5 * sin(B) =
= 9.87 * sin(-2 * 11.87) - 7.53 * cos(-11.87) - 1.5 * sin(-11.87)
= -11.04min ∼ -11min
The standard meridian for Athens is 30°E longitude.
The apparent solar time on March 10 at 2:30 pm for the city of Athens, Greece (23°40′E longitude) is
AST = 14:30 - 4 * (30 - 23.66) - 0:11
= 14:30 - 0:25 - 0:11
= 13:54 or 1:54 pm
Additional notes:
Nomenclature from [1]_
α [°] solar altitude angle
β [°] tilt angle
δ [°] solar declination
θ [°] solar incidence angle
Φ [°] solar zenith angle
h [°] hour angle
L [°] local latitude
N [-] day of the year
z [°] solar azimuth angle
ZS [°] surface azimuth angle
AST Apparent Solar Time
LST Local Standard Time
ET Equation of Time
SL Standard Longitude
LL Local Longitude
DS Daylight Saving
.. [1] Determination of Optimal Position of Solar Trough Collector. Available from: https://www.researchgate.net/publication/317826540_Determination_of_Optimal_Position_of_Solar_Trough_Collector [accessed Sep 06 2023].
In PVGIS : hour_angle = (solar_time / 3600 - 12) * 15 * 0.0175
which means:
- solar time is expected in seconds
- conversion to radians `* 0.0175` replaced by `pi / 180`
In this function:
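Example 2 above can be reproduced numerically. A plain-Python sketch, with all times in minutes and east longitudes positive (the function names are illustrative, not part of the API):

```python
from math import sin, cos, radians

def equation_of_time_minutes(day_of_year: int) -> float:
    """ET in minutes, with B = 360/364 * (N - 81) degrees, as in Example 2."""
    b = radians(360 / 364 * (day_of_year - 81))
    return 9.87 * sin(2 * b) - 7.53 * cos(b) - 1.5 * sin(b)

def apparent_solar_time_minutes(clock_minutes: float, standard_longitude: float,
                                local_longitude: float, day_of_year: int) -> float:
    """AST = LST - 4 * (SL - LL) + ET, everything in minutes."""
    return (clock_minutes
            - 4 * (standard_longitude - local_longitude)
            + equation_of_time_minutes(day_of_year))

# March 10 (N = 69), Athens (23.66 E, standard meridian 30 E), clock time 14:30:
et = equation_of_time_minutes(69)                                 # ~ -11 minutes
ast = apparent_solar_time_minutes(14 * 60 + 30, 30.0, 23.66, 69)  # ~ 833.6 min ~ 13:54
hour_angle_degrees = (ast / 60 - 12) * 15                         # ~ 28.4 deg past noon
```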
Source code in pvgisprototype/api/position/hour_angle.py
def calculate_solar_hour_angle_series(
longitude: Longitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_position_models: List[SolarPositionModel] = [SolarPositionModel.noaa],
solar_time_model: SolarTimeModel = SolarTimeModel.noaa,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> SolarHourAngle:
"""Calculate the hour angle ω'
ω = (ST / 3600 - 12) * 15 * pi / 180
Parameters
----------
solar_time : float
The solar time (ST) is a calculation of the passage of time based on the
position of the Sun in the sky. It is expected to be decimal hours in a
24 hour format and measured internally in seconds.
output_units: str, optional
Angle output units (degrees or radians).
Returns
-------
SolarHourAngle
The hour angle (ω): the angle at any instant through which the earth has
to turn to bring the meridian of the observer directly in line with the
sun's rays, measured in radians.
Notes
-----
The hour angle ω (elsewhere symbolised with `h`) of a point on the earth’s
surface is defined as the angle through which the earth would turn to bring
the meridian of the point directly under the sun. The hour angle at local
solar noon is zero, with each 360/24 or 15° of longitude equivalent to 1 h,
afternoon hours being designated as positive. Expressed symbolically, the
hour angle in degrees is:
h = ±0.25 (Number of minutes from local solar noon)
where the plus sign applies to afternoon hours and the minus sign to
morning hours.
The hour angle can also be obtained from the apparent solar time (AST);
that is, the corrected local solar time:
h = (AST - 12) * 15
At local solar noon, AST = 12 and h = 0°. Therefore, from Eq. (2.3), the
local standard time (LST, the time shown by our clocks at local solar noon)
is:
LST = 12 - ET ∓ 4 * (SL - LL)
where:
ET is the Equation of Time
SL Standard Longitude
LL Local Longitude
Example 1
The equation for LST at local solar noon for Nicosia, Cyprus is:
LST = 12 - ET - 13.32 (minutes)
Example 2
The ET for March 10 (N = 69) is calculated from Eq. (2.1), in which
the factor B is obtained from Eq. (2.2) as:
B = 360 / 364 * (N - 81) = 360 / 364 * (69 - 81) = -11.87
ET = 9.87 * sin(2*B) - 7.53 * cos(B) - 1.5 * sin(B) =
= 9.87 * sin(-2 * 11.87) - 7.53 * cos(-11.87) - 1.5 * sin(-11.87)
= -11.04min ∼ -11min
The standard meridian for Athens is 30°E longitude.
The apparent solar time on March 10 at 2:30 pm for the city of Athens,
Greece (23°40′E longitude) is
AST = 14:30 - 4 * (30 - 23.66) - 0:11
= 14:30 - 0:25 - 0:11
= 13:54 or 1:54 pm
Additional notes:
Nomenclature from [1]_
α [°] solar altitude angle
β [°] tilt angle
δ [°] solar declination
θ [°] solar incidence angle
Φ [°] solar zenith angle
h [°] hour angle
L [°] local latitude
N [-] day of the year
z [°] solar azimuth angle
ZS [°] surface azimuth angle
AST Apparent Solar Time
LST Local Standard Time
ET Equation of Time
SL Standard Longitude
LL Local Longitude
DS Daylight Saving
.. [1] Determination of Optimal Position of Solar Trough Collector. Available from: https://www.researchgate.net/publication/317826540_Determination_of_Optimal_Position_of_Solar_Trough_Collector [accessed Sep 06 2023].
In PVGIS :
hour_angle = (solar_time / 3600 - 12) * 15 * 0.0175
which means:
- solar time is expected in seconds
- conversion to radians `* 0.0175` replaced by `pi / 180`
In this function:
"""
results = {}
for solar_position_model in solar_position_models:
# for the time being! ------------------------------------------------
if solar_position_model != SolarPositionModel.noaa:
logger.warning(
f"Solar geometry overview series is not implemented for the requested solar position model: {solar_position_model}!"
)
# --------------------------------------------------------------------
if (
solar_position_model != SolarPositionModel.all
): # ignore 'all' in the enumeration
solar_hour_angle_series = model_solar_hour_angle_series(
longitude=longitude,
# latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_time_model=solar_time_model,
solar_position_model=solar_position_model,
dtype=dtype,
array_backend=array_backend,
validate_output=validate_output,
verbose=verbose,
log=log,
)
solar_hour_angle_series.build_output(
verbose=verbose,
fingerprint=fingerprint,
angle_output_units=angle_output_units,
)
solar_hour_angle_overview = {
solar_position_model.name: solar_hour_angle_series.output
}
results = results | solar_hour_angle_overview
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return results
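The Equation of Time and hour-angle relations quoted in the docstring above can be checked numerically against Example 2 (Athens, March 10). A minimal standard-library sketch; the function names are illustrative and not part of the `pvgisprototype` API:

```python
from math import sin, cos, radians

def equation_of_time(day_of_year: int) -> float:
    """Equation of Time (ET) in minutes, per Eqs. (2.1)-(2.2) above."""
    B = radians(360 / 364 * (day_of_year - 81))
    return 9.87 * sin(2 * B) - 7.53 * cos(B) - 1.5 * sin(B)

def apparent_solar_time(clock_minutes: float, standard_longitude: float,
                        local_longitude: float, day_of_year: int) -> float:
    """AST in minutes since midnight; longitudes in degrees East.
    For the eastern hemisphere: AST = clock time - 4*(SL - LL) + ET."""
    return (clock_minutes
            - 4 * (standard_longitude - local_longitude)
            + equation_of_time(day_of_year))

def hour_angle_degrees(ast_minutes: float) -> float:
    """h = (AST - 12 h) * 15 deg per hour."""
    return (ast_minutes / 60 - 12) * 15

# Example 2 above: Athens (23.66°E, SL = 30°E), March 10 (N = 69),
# clock time 14:30 = 870 minutes -> AST about 833.6 min, i.e. ~13:54
ast = apparent_solar_time(870, 30, 23.66, 69)
```

Running this reproduces the worked example: ET(69) ≈ -11 min and an apparent solar time of roughly 13:54, from which the hour angle follows directly.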
model_solar_hour_angle_series ¶
model_solar_hour_angle_series(
longitude: Longitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_position_model: SolarPositionModel = noaa,
solar_time_model: SolarTimeModel = milne,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> SolarHourAngle
Source code in pvgisprototype/api/position/hour_angle.py
@log_function_call
@custom_cached
# @validate_with_pydantic(CalculateSolarHourAngleTimeSeriesNOAAInput)
def model_solar_hour_angle_series(
longitude: Longitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_position_model: SolarPositionModel = SolarPositionModel.noaa,
solar_time_model: SolarTimeModel = SolarTimeModel.milne,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> SolarHourAngle:
""" """
solar_hour_angle_series = None
if solar_position_model.value == SolarPositionModel.noaa:
solar_hour_angle_series = calculate_solar_hour_angle_series_noaa(
longitude=longitude,
timestamps=timestamps,
timezone=timezone,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
if solar_position_model.value == SolarPositionModel.skyfield:
pass
if solar_position_model.value == SolarPositionModel.jenco:
pass
if solar_position_model.value == SolarPositionModel.hofierka:
solar_hour_angle_series = calculate_solar_hour_angle_series_hofierka(
longitude=longitude,
timestamps=timestamps,
timezone=timezone,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if solar_position_model.value == SolarPositionModel.pvlib:
solar_hour_angle_series = calculate_solar_hour_angle_series_pvlib(
longitude=longitude,
timestamps=timestamps,
# timezone=timezone,
# dtype=dtype,
# array_backend=array_backend,
# verbose=verbose,
# log=log,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return solar_hour_angle_series
incidence ¶
API modules to calculate the solar incidence angle between the direction of the sun-to-surface vector and either the direction of the normal-to-surface vector or the direction of the surface-plane vector.
Attention is required in handling the rotational solar azimuth and surface orientation (also referred to as surface azimuth) angles. The origin from which azimuthal angles are measured obviously impacts the direction of the calculated angles. See also the API azimuth.py module.
Functions:
| Name | Description |
|---|---|
calculate_solar_incidence_series | Calculates the solar Incidence angle for the selected models and returns the results in a table |
model_solar_incidence_series | |
calculate_solar_incidence_series ¶
calculate_solar_incidence_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
surface_orientation: SurfaceOrientation = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt = SURFACE_TILT_DEFAULT,
solar_position_model: SolarPositionModel = noaa,
sun_horizon_position: List[
SunHorizonPositionModel
] = SUN_HORIZON_POSITION_DEFAULT,
solar_incidence_models: List[SolarIncidenceModel] = [
iqbal
],
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = pvgis,
complementary_incidence_angle: bool = COMPLEMENTARY_INCIDENCE_ANGLE_DEFAULT,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> Dict
Calculates the solar Incidence angle for the selected models and returns the results in a table
Source code in pvgisprototype/api/position/incidence.py
@log_function_call
def calculate_solar_incidence_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
surface_orientation: SurfaceOrientation = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt = SURFACE_TILT_DEFAULT,
# solar_time_model: SolarTimeModel = SolarTimeModel.milne,
solar_position_model: SolarPositionModel = SolarPositionModel.noaa, # Only one !
sun_horizon_position: List[SunHorizonPositionModel] = SUN_HORIZON_POSITION_DEFAULT,
solar_incidence_models: List[SolarIncidenceModel] = [SolarIncidenceModel.iqbal],
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = ShadingModel.pvgis,
complementary_incidence_angle: bool = COMPLEMENTARY_INCIDENCE_ANGLE_DEFAULT,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> Dict:
"""Calculates the solar Incidence angle for the selected models and returns the results in a table"""
results = {}
for solar_incidence_model in solar_incidence_models:
if (
solar_incidence_model != SolarIncidenceModel.all
): # ignore 'all' in the enumeration
solar_incidence_series = model_solar_incidence_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
surface_orientation=surface_orientation,
surface_tilt=surface_tilt,
solar_position_model=solar_position_model,
sun_horizon_position=sun_horizon_position,
# solar_time_model=solar_time_model,
solar_incidence_model=solar_incidence_model,
horizon_profile=horizon_profile,
shading_model=shading_model,
complementary_incidence_angle=complementary_incidence_angle,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
validate_output=validate_output,
verbose=verbose,
log=log,
)
solar_incidence_series.build_output(
verbose=verbose,
fingerprint=fingerprint,
angle_output_units=angle_output_units,
)
solar_incidence_overview = {
solar_position_model.name: solar_incidence_series.output
}
results = results | solar_incidence_overview
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return results
model_solar_incidence_series ¶
model_solar_incidence_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo | None = None,
surface_orientation: SurfaceOrientation = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt = SURFACE_TILT_DEFAULT,
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
solar_time_model: SolarTimeModel = milne,
solar_position_model: SolarPositionModel = noaa,
sun_horizon_position: List[
SunHorizonPositionModel
] = SUN_HORIZON_POSITION_DEFAULT,
solar_incidence_model: SolarIncidenceModel = iqbal,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = pvgis,
complementary_incidence_angle: bool = COMPLEMENTARY_INCIDENCE_ANGLE_DEFAULT,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
) -> SolarIncidence
Source code in pvgisprototype/api/position/incidence.py
@log_function_call
@custom_cached
@validate_with_pydantic(ModelSolarIncidenceTimeSeriesInputModel)
def model_solar_incidence_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo | None = None,
surface_orientation: SurfaceOrientation = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt = SURFACE_TILT_DEFAULT,
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
# refracted_solar_zenith: (
# float | None
# ) = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT, # radians
solar_time_model: SolarTimeModel = SolarTimeModel.milne,
solar_position_model: SolarPositionModel = SolarPositionModel.noaa,
sun_horizon_position: List[SunHorizonPositionModel] = SUN_HORIZON_POSITION_DEFAULT,
solar_incidence_model: SolarIncidenceModel = SolarIncidenceModel.iqbal,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = ShadingModel.pvgis,
complementary_incidence_angle: bool = COMPLEMENTARY_INCIDENCE_ANGLE_DEFAULT,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
) -> SolarIncidence:
""" """
logger.debug(
f"Executing solar positioning modelling function model_solar_incidence_series() for\n{timestamps}",
alt=f"Executing [underline]solar positioning modelling[/underline] function model_solar_incidence_series() for\n{timestamps}"
)
solar_incidence_series = None
surface_in_shade_series = model_surface_in_shade_series(
horizon_profile=horizon_profile,
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_time_model=solar_time_model,
solar_position_model=solar_position_model,
shading_model=shading_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
if solar_incidence_model.value == SolarIncidenceModel.jenco:
# Update-Me ----------------------------------------------------------
# Hofierka (2002) measures azimuth angles from East !
# Convert the user-defined North-based surface orientation angle to East-based
# surface_orientation_east_convention = SurfaceOrientation(
# value=convert_north_to_east_radians_convention(
# north_based_angle=surface_orientation
# ),
# unit=RADIANS,
# )
surface_orientation_south_convention = SurfaceOrientation(
value=convert_north_to_south_radians_convention(
north_based_angle=surface_orientation
),
unit=RADIANS,
)
# And apparently, defined the complementary surface tilt angle too!
# from math import pi
# surface_tilt = SurfaceTilt(
# value=(pi/2 - surface_tilt.radians),
# unit=RADIANS,
# )
# ---------------------------------------------------------- Update-Me
solar_incidence_series = calculate_solar_incidence_series_jenco(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
# surface_orientation=surface_orientation,
# surface_orientation=surface_orientation_east_convention,
surface_orientation=surface_orientation_south_convention,
surface_tilt=surface_tilt,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
surface_in_shade_series=surface_in_shade_series,
complementary_incidence_angle=complementary_incidence_angle,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if solar_incidence_model.value == SolarIncidenceModel.iqbal:
# Iqbal (1983) measures azimuthal angles from South !
surface_orientation_south_convention = SurfaceOrientation(
value=convert_north_to_south_radians_convention(
north_based_angle=surface_orientation
),
unit=RADIANS,
)
solar_incidence_series = calculate_solar_incidence_series_iqbal(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
surface_orientation=surface_orientation_south_convention,
surface_tilt=surface_tilt,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
sun_horizon_position=sun_horizon_position,
surface_in_shade_series=surface_in_shade_series,
complementary_incidence_angle=complementary_incidence_angle,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
dtype=dtype,
array_backend=array_backend,
validate_output=validate_output,
verbose=verbose,
log=log,
)
if solar_incidence_model.value == SolarIncidenceModel.hofierka:
surface_orientation_south_convention = SurfaceOrientation(
value=convert_north_to_south_radians_convention(
north_based_angle=surface_orientation
),
unit=RADIANS,
)
solar_incidence_series = calculate_solar_incidence_series_hofierka(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
# surface_orientation=surface_orientation,
surface_orientation=surface_orientation_south_convention,
surface_tilt=surface_tilt,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
complementary_incidence_angle=complementary_incidence_angle,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if solar_incidence_model.value == SolarIncidenceModel.pvlib:
solar_incidence_series = calculate_solar_incidence_series_pvlib(
longitude=longitude,
latitude=latitude,
surface_tilt=surface_tilt,
surface_orientation=surface_orientation,
timestamps=timestamps,
complementary_incidence_angle=complementary_incidence_angle,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
logger.debug(
f"Returning solar incidence time series :\n{solar_incidence_series}",
alt=f"Returning [yellow]solar incidence[/yellow] time series :\n{solar_incidence_series}"
)
return solar_incidence_series
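Several branches above convert the user's North-based surface orientation to the South-based convention expected by Iqbal (1983) and Hofierka (2002) via `convert_north_to_south_radians_convention()`. The conversion amounts to shifting the azimuth by π and re-wrapping; a minimal sketch of that idea (the library's own helper may wrap to a different interval):

```python
from math import pi

def north_to_south_convention(angle_north: float) -> float:
    """Map a North-based azimuth (0 = North, clockwise, radians) to a
    South-based one (0 = South), wrapped to [-pi, pi).
    Illustrative only -- convert_north_to_south_radians_convention()
    in pvgisprototype may use a different wrapping interval."""
    shifted = angle_north - pi
    # wrap into [-pi, pi)
    return shifted - 2 * pi * ((shifted + pi) // (2 * pi))
```

With this sketch, a South-facing surface (π in the North-based convention) maps to 0, and an East-facing one (π/2) maps to -π/2.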
models ¶
Classes:
| Name | Description |
|---|---|
SolarSurfacePositionParameter | Position parameters for a solar surface, i.e. a photovoltaic module. |
SolarSurfacePositionParameterColumnName | Column names for Position parameters for a solar surface, i.e. a photovoltaic module. |
Functions:
| Name | Description |
|---|---|
select_models | Select models from an enum list. |
validate_model | Check that one and only one model from an Enum class is selected |
SolarSurfacePositionParameter ¶
Bases: str, Enum
Position parameters for a solar surface, i.e. a photovoltaic module.
SolarSurfacePositionParameterColumnName ¶
Bases: str, Enum
Column names for Position parameters for a solar surface, i.e. a photovoltaic module.
select_models ¶
Select models from an enum list.
Source code in pvgisprototype/api/position/models.py
validate_model ¶
Check that one and only one model from an Enum class is selected
Source code in pvgisprototype/api/position/models.py
def validate_model(enum_type: Type[Enum], model: List[Enum]) -> Enum:
"""Check that one and only one model from an Enum class is selected"""
if model == enum_type.all: # or len(model) > 1: will not work! -- ReviewMe
raise typer.BadParameter(
"You can select only one model for [code]solar_time_model[/code]. Multiple or all are not meaningful nor possible."
)
return model
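The driver functions in this module repeatedly skip the `all` sentinel when iterating over model enumerations ("ignore 'all' in the enumeration"), while `validate_model` rejects it outright. The pattern can be sketched with a toy enum (names here are hypothetical, not the package's own):

```python
from enum import Enum

class ToyModel(str, Enum):
    noaa = "noaa"
    iqbal = "iqbal"
    all = "all"      # sentinel meaning "every concrete model"

def expand_models(models):
    """Expand the 'all' sentinel into every concrete member, mirroring
    the 'ignore all in the enumeration' loops used by the drivers."""
    if ToyModel.all in models:
        return [m for m in ToyModel if m is not ToyModel.all]
    return [m for m in models if m is not ToyModel.all]
```

This keeps the per-model loop bodies free of sentinel handling: the caller always receives a list of concrete models.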
output ¶
Functions:
| Name | Description |
|---|---|
generate_dictionary_of_surface_in_shade_series | |
generate_dictionary_of_surface_in_shade_series_x | |
generate_dictionary_of_surface_in_shade_series ¶
generate_dictionary_of_surface_in_shade_series(
surface_in_shade_series: LocationShading,
angle_output_units,
)
Source code in pvgisprototype/api/position/output.py
@log_function_call
def generate_dictionary_of_surface_in_shade_series(
surface_in_shade_series: LocationShading,
angle_output_units,
):
""" """
return {
SolarPositionParameter.horizon: (
getattr(
surface_in_shade_series.horizon_height,
angle_output_units,
NOT_AVAILABLE,
)
if surface_in_shade_series
else NOT_AVAILABLE
),
# BEHIND_HORIZON_NAME: (
# surface_in_shade_series.value if surface_in_shade_series else NOT_AVAILABLE
# ),
SolarPositionParameter.visible: (
~surface_in_shade_series.value if surface_in_shade_series else NOT_AVAILABLE
),
SHADING_ALGORITHM_NAME: surface_in_shade_series.shading_algorithm,
}
generate_dictionary_of_surface_in_shade_series_x ¶
Source code in pvgisprototype/api/position/output.py
def generate_dictionary_of_surface_in_shade_series_x(
surface_in_shade_series: LocationShading,
):
""" """
return {
SolarPositionParameter.horizon: (
surface_in_shade_series.horizon_height
if surface_in_shade_series
else NOT_AVAILABLE
),
SolarPositionParameter.visible: (
~surface_in_shade_series.value if surface_in_shade_series else NOT_AVAILABLE
),
SHADING_ALGORITHM_NAME: surface_in_shade_series.shading_algorithm,
}
overview ¶
Functions:
| Name | Description |
|---|---|
calculate_solar_position_overview_series | Calculate an overview of solar position parameters for a time series. |
model_solar_position_overview_series | Model solar position parameters for a position and moment in time. |
calculate_solar_position_overview_series ¶
calculate_solar_position_overview_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
event: List[SolarEvent | None] = [None],
surface_orientation: SurfaceOrientation = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt = SURFACE_TILT_DEFAULT,
solar_position_models: List[SolarPositionModel] = [
noaa
],
sun_horizon_position: List[
SunHorizonPositionModel
] = SUN_HORIZON_POSITION_DEFAULT,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = pvgis,
complementary_incidence_angle: bool = COMPLEMENTARY_INCIDENCE_ANGLE_DEFAULT,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
solar_time_model: SolarTimeModel = noaa,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = VERBOSE_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> Dict
Calculate an overview of solar position parameters for a time series.
Calculate an overview of solar position parameters for a solar surface orientation and tilt at a given geographic position for a time series and for the user-requested solar position models (as in positioning algorithms) and one solar time model (as in solar timing algorithm).
Notes
While it is straightforward to report the solar position parameters for a series of solar position models (positioning algorithms), offering the option for multiple solar time models (timing algorithms), would mean to carefully craft the combinations for each solar time model and solar position models. Not impossible, yet something for expert users that would like to assess different combinations of algorithms to explore and assess solar position parameters.
Source code in pvgisprototype/api/position/overview.py
def calculate_solar_position_overview_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
event: List[SolarEvent | None] = [None],
surface_orientation: SurfaceOrientation = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt = SURFACE_TILT_DEFAULT,
solar_position_models: List[SolarPositionModel] = [SolarPositionModel.noaa],
sun_horizon_position: List[SunHorizonPositionModel] = SUN_HORIZON_POSITION_DEFAULT,
# solar_incidence_model: SolarIncidenceModel = SolarIncidenceModel.iqbal,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = ShadingModel.pvgis,
complementary_incidence_angle: bool = COMPLEMENTARY_INCIDENCE_ANGLE_DEFAULT,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
solar_time_model: SolarTimeModel = SolarTimeModel.noaa,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = VERBOSE_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> Dict:
"""Calculate an overview of solar position parameters for a time series.
Calculate an overview of solar position parameters for a solar surface
orientation and tilt at a given geographic position for a time series and
for the user-requested solar position models (as in positioning algorithms)
and one solar time model (as in solar timing algorithm).
Notes
-----
While it is straightforward to report the solar position parameters for a
series of solar position models (positioning algorithms), offering the
option for multiple solar time models (timing algorithms), would mean to
carefully craft the combinations for each solar time model and solar
position models. Not impossible, yet something for expert users that would
like to assess different combinations of algorithms to explore and assess
solar position parameters.
"""
results = {}
for solar_position_model in solar_position_models:
# for the time being! ------------------------------------------------
if solar_position_model != SolarPositionModel.noaa:
logger.warning(
f"Solar geometry overview series is not implemented for the requested solar position model: {solar_position_model}!"
)
# --------------------------------------------------------------------
if (
solar_position_model != SolarPositionModel.all
): # ignore 'all' in the enumeration
(
solar_declination_series,
solar_hour_angle_series,
solar_zenith_series,
solar_altitude_series,
solar_azimuth_series,
surface_orientation,
surface_tilt,
solar_incidence_series,
sun_horizon_position_series, # time series of relative position
surface_in_shade_series,
solar_event_type_series,
solar_event_time_series,
) = model_solar_position_overview_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
event=event,
surface_orientation=surface_orientation,
surface_tilt=surface_tilt,
solar_time_model=solar_time_model,
solar_position_model=solar_position_model,
sun_horizon_position=sun_horizon_position, # positions for which to perform calculations !
horizon_profile=horizon_profile,
shading_model=shading_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# solar_incidence_model=solar_incidence_model,
complementary_incidence_angle=complementary_incidence_angle,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
solar_position_overview = SolarPositionOverview(
#
solar_position_model=solar_position_model,
# Positioning
solar_timing_algorithm=solar_time_model,
solar_declination=solar_declination_series,
solar_hour_angle=solar_hour_angle_series,
solar_positioning_algorithm=solar_position_model,
adjusted_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
solar_zenith=solar_zenith_series,
adjust_for_atmospheric_refraction=solar_zenith_series.adjusted_for_atmospheric_refraction,
solar_altitude=solar_altitude_series,
refracted_solar_altitude=solar_altitude_series.refracted_value,
solar_azimuth=solar_azimuth_series,
solar_azimuth_origin=solar_azimuth_series.origin,
# Incidence
surface_orientation=surface_orientation,
surface_tilt=surface_tilt,
solar_incidence=solar_incidence_series,
solar_incidence_model=solar_incidence_series.algorithm,
solar_incidence_definition=solar_incidence_series.definition,
# Sun-to-Horizon -- ** Rethink parameters naming here ! **
sun_horizon_position=sun_horizon_position_series, # time series of relative sun position !
sun_horizon_positions=sun_horizon_position, # positions for which calculations were performed !
horizon_height=surface_in_shade_series.horizon_height,
surface_in_shade=surface_in_shade_series,
shading_algorithm=shading_model,
# shading_states=shading_states,
# visible=~surface_in_shade_series.value if surface_in_shade_series else NOT_AVAILABLE,
visible=surface_in_shade_series.visible,
# Solar events
event=event,
event_type=solar_event_type_series,
event_time=solar_event_time_series,
#
angle_output_units=solar_incidence_series.unit,
)
solar_position_overview.build_output(
verbose=verbose,
fingerprint=fingerprint,
angle_output_units=angle_output_units,
)
            results = results | {
solar_position_model.name: solar_position_overview.output
}
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return results
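The `calculate_*_series` drivers accumulate one output per solar position model by merging dictionaries with the union operator (`|`, Python 3.9+), so earlier models' results are preserved across loop iterations. A minimal sketch of that accumulation pattern:

```python
# Accumulate per-model outputs into one dictionary, as the driver
# loops above do with `results = results | {...}` (Python 3.9+).
results = {}
for model in ("noaa", "hofierka"):
    results = results | {model: f"{model}-output"}
```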
model_solar_position_overview_series ¶
model_solar_position_overview_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
event: List[SolarEvent | None] = [None],
surface_orientation: SurfaceOrientation = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt = SURFACE_TILT_DEFAULT,
solar_time_model: SolarTimeModel = milne,
solar_position_model: SolarPositionModel = noaa,
sun_horizon_position: List[
SunHorizonPositionModel
] = SUN_HORIZON_POSITION_DEFAULT,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = pvgis,
adjust_for_atmospheric_refraction: bool = True,
complementary_incidence_angle: bool = COMPLEMENTARY_INCIDENCE_ANGLE_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> Tuple
Model solar position parameters for a position and moment in time.
Model essential solar position parameters for a solar surface orientation and tilt at a given geographic position for a time series based on a given solar position model (as in positioning algorithm, see class SolarPositionModel) and solar time model (as in solar timing algorithm, see class SolarTimeModel) :
- solar declination
- solar hour angle
- solar zenith
- solar altitude
- solar azimuth
- solar incidence
- sun-to-horizon position
Notes
The solar altitude angle measures from the horizon up towards the zenith (positive) and down towards the nadir (negative). The altitude is zero all along the horizon, the great circle midway between zenith and nadir.
The solar azimuth angle measures horizontally around the horizon from north through east, south, and west.
In order to avoid confusion, the solar incidence angle series are derived using specific functions for each "algorithm". For example, the NOAA solar positioning set of equations are "bound" to the calculate_solar_incidence_series_iqbal(). However, thinking of more flexibility, for example to facilitate a cross-comparison between different implementations of the Equation of Time and their impact on different solar incidence angle definitions, we can refactor the source code to allow for combinations of different "blocks" of solar timing and positioning algorithms.
The following combinations are currently the default ones to derive the solar incidence angle :
- SolarPositionModel.noaa + SolarIncidenceModel.iqbal
- SolarPositionModel.jenco + SolarIncidenceModel.jenco
- SolarPositionModel.hofierka + SolarIncidenceModel.hofierka
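The default pairing of positioning and incidence algorithms listed above can be expressed as a simple lookup table. This is an illustrative sketch (the mapping and helper name are not part of the package API), using plain strings for the enum members:

```python
# Hypothetical helper: default incidence algorithm for each
# positioning algorithm, per the combinations documented above.
DEFAULT_INCIDENCE_FOR_POSITION = {
    "noaa": "iqbal",
    "jenco": "jenco",
    "hofierka": "hofierka",
}

def default_incidence_model(position_model: str) -> str:
    """Return the incidence model paired with a position model."""
    return DEFAULT_INCIDENCE_FOR_POSITION[position_model]
```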
Source code in pvgisprototype/api/position/overview.py
@log_function_call
@validate_with_pydantic(ModelSolarPositionOverviewSeriesInputModel)
def model_solar_position_overview_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
event: List[SolarEvent | None] = [None],
surface_orientation: SurfaceOrientation = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt = SURFACE_TILT_DEFAULT,
solar_time_model: SolarTimeModel = SolarTimeModel.milne,
solar_position_model: SolarPositionModel = SolarPositionModel.noaa,
sun_horizon_position: List[SunHorizonPositionModel] = SUN_HORIZON_POSITION_DEFAULT,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = ShadingModel.pvgis,
adjust_for_atmospheric_refraction: bool = True,
# unrefracted_solar_zenith: UnrefractedSolarZenith | None = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT,
# solar_incidence_model: SolarIncidenceModel = SolarIncidenceModel.iqbal,
complementary_incidence_angle: bool = COMPLEMENTARY_INCIDENCE_ANGLE_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> Tuple:
"""Model solar position parameters for a position and moment in time.
Model essential solar position parameters for a solar surface
orientation and tilt at a given geographic position for a time series based
on a given solar position model (as in positioning algorithm, see class
`SolarPositionModel`) and solar time model (as in solar timing algorithm,
see class `SolarTimeModel`) :
- solar declination
- solar hour angle
- solar zenith
- solar altitude
- solar azimuth
- solar incidence
- sun-to-horizon position
Notes
-----
    The solar altitude angle measures from the horizon up towards the zenith
    (positive) and down towards the nadir (negative). The altitude is zero all
    along the horizon, the great circle midway between zenith and nadir.
The solar azimuth angle measures horizontally around the horizon from north
through east, south, and west.
To avoid confusion, the solar incidence angle series are derived using a
dedicated function per algorithm. For example, the NOAA solar positioning
set of equations is bound to `calculate_solar_incidence_series_iqbal()`.
For more flexibility, for example to facilitate a cross-comparison between
different implementations of the Equation of Time and their impact on
different solar incidence angle definitions, the source code could be
refactored to allow combinations of different building blocks of solar
timing and positioning algorithms.
The following combinations are currently the defaults for deriving the
solar incidence angle :
- SolarPositionModel.noaa + SolarIncidenceModel.iqbal
- SolarPositionModel.jenco + SolarIncidenceModel.jenco
- SolarPositionModel.hofierka + SolarIncidenceModel.hofierka
"""
solar_declination_series = None # updated if applicable
solar_hour_angle_series = None
solar_zenith_series = None # updated if applicable
solar_altitude_series = None
solar_azimuth_series = None
solar_incidence_series = None
sun_horizon_position_series = None
surface_in_shade_series = model_surface_in_shade_series(
horizon_profile=horizon_profile,
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_time_model=solar_time_model,
solar_position_model=solar_position_model,
shading_model=shading_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
solar_event_series = model_solar_event_time_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
event=event,
timezone=timezone,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
solar_event_type_series = solar_event_series.event_type
solar_event_time_series = solar_event_series.value
if solar_position_model.value == SolarPositionModel.noaa:
solar_declination_series = calculate_solar_declination_series_noaa(
timestamps=timestamps,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
solar_hour_angle_series = calculate_solar_hour_angle_series_noaa(
longitude=longitude,
timestamps=timestamps,
timezone=timezone,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
solar_zenith_series = calculate_solar_zenith_series_noaa(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
solar_altitude_series = calculate_solar_altitude_series_noaa(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
solar_azimuth_series = calculate_solar_azimuth_series_noaa(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
# Iqbal (1983) measures azimuthal angles from South !
surface_orientation_south_convention = SurfaceOrientation(
value=convert_north_to_south_radians_convention(
north_based_angle=surface_orientation
),
unit=RADIANS,
)
solar_incidence_series = calculate_solar_incidence_series_iqbal(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
surface_orientation=surface_orientation_south_convention,
surface_tilt=surface_tilt,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
sun_horizon_position=sun_horizon_position,
surface_in_shade_series=surface_in_shade_series,
complementary_incidence_angle=complementary_incidence_angle,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output
)
if solar_position_model.value == SolarPositionModel.skyfield:
pass
# solar_hour_angle, solar_declination = calculate_solar_hour_angle_declination_skyfield(
# longitude=longitude,
# latitude=latitude,
# timestamp=timestamp,
# timezone=timezone,
# )
# solar_altitude, solar_azimuth = calculate_solar_altitude_azimuth_skyfield(
# longitude=longitude,
# latitude=latitude,
# timestamp=timestamp,
# )
# solar_zenith = SolarZenith(
# value = 90 - solar_altitude.degrees,
# unit = DEGREES,
# solar_positioning_algorithm=solar_azimuth.solar_positioning_algorithm,
# solar_timing_algorithm=solar_azimuth.timing_algorithm,
# )
if solar_position_model.value == SolarPositionModel.suncalc:
pass
# # note : first azimuth, then altitude
# solar_azimuth_south_radians_convention, solar_altitude = suncalc.get_position(
# date=timestamp, # this comes first here!
# lng=longitude.degrees,
# lat=latitude.degrees,
# ).values() # zero points to south
# solar_azimuth = convert_south_to_north_radians_convention(
# solar_azimuth_south_radians_convention
# )
# solar_azimuth = SolarAzimuth(
# value=solar_azimuth,
# unit=RADIANS,
# solar_positioning_algorithm='suncalc',
# solar_timing_algorithm='suncalc',
# )
# solar_altitude = SolarAltitude(
# value=solar_altitude,
# unit=RADIANS,
# solar_positioning_algorithm='suncalc',
# solar_timing_algorithm='suncalc',
# )
# solar_zenith = SolarZenith(
# value = 90 - solar_altitude.degrees,
# unit = DEGREES,
# solar_positioning_algorithm='suncalc',
# solar_timing_algorithm='suncalc',
# )
if solar_position_model.value == SolarPositionModel.jenco:
solar_declination_series = calculate_solar_declination_series_jenco(
timestamps=timestamps,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
solar_altitude_series = calculate_solar_altitude_series_jenco(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=0,
log=log,
)
solar_azimuth_series = calculate_solar_azimuth_series_jenco(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
dtype=dtype,
array_backend=array_backend,
verbose=0,
log=log,
) # North = 0
surface_orientation_south_convention = SurfaceOrientation(
value=convert_north_to_south_radians_convention(
north_based_angle=surface_orientation
),
unit=RADIANS,
)
# And apparently, defined the complementary surface tilt angle too!
# from math import pi
# surface_tilt = SurfaceTilt(
# value=(pi/2 - surface_tilt.radians),
# unit=RADIANS,
# )
# ---------------------------------------------------------- Update-Me
solar_incidence_series = calculate_solar_incidence_series_jenco(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
# surface_orientation=surface_orientation,
# surface_orientation=surface_orientation_east_convention,
surface_orientation=surface_orientation_south_convention,
surface_tilt=surface_tilt,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
surface_in_shade_series=surface_in_shade_series,
complementary_incidence_angle=complementary_incidence_angle,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if solar_position_model.value == SolarPositionModel.hofierka:
solar_declination_series = calculate_solar_declination_series_hofierka(
timestamps=timestamps,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
solar_hour_angle_series = calculate_solar_hour_angle_series_hofierka(
longitude=longitude,
timestamps=timestamps,
timezone=timezone,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
solar_altitude_series = calculate_solar_altitude_series_hofierka(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
solar_azimuth_series = calculate_solar_azimuth_series_hofierka(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
# adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
) # East = 0 !
surface_orientation_south_convention = SurfaceOrientation(
value=convert_north_to_south_radians_convention(
north_based_angle=surface_orientation
),
unit=RADIANS,
)
solar_incidence_series = calculate_solar_incidence_series_hofierka(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
# surface_orientation=surface_orientation,
surface_orientation=surface_orientation_south_convention,
surface_tilt=surface_tilt,
complementary_incidence_angle=complementary_incidence_angle,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if solar_position_model.value == SolarPositionModel.pvlib:
solar_declination_series = calculate_solar_declination_series_pvlib(
timestamps=timestamps,
# dtype=dtype,
# array_backend=array_backend,
# verbose=verbose,
# log=log,
)
solar_hour_angle_series = calculate_solar_hour_angle_series_pvlib(
longitude=longitude,
timestamps=timestamps,
# timezone=timezone,
# dtype=dtype,
# array_backend=array_backend,
# verbose=verbose,
# log=log,
)
solar_zenith_series = calculate_solar_zenith_series_pvlib(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
solar_altitude_series = calculate_solar_altitude_series_pvlib(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
solar_azimuth_series = calculate_solar_azimuth_series_pvlib(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
solar_incidence_series = calculate_solar_incidence_series_pvlib(
longitude=longitude,
latitude=latitude,
surface_tilt=surface_tilt,
surface_orientation=surface_orientation,
timestamps=timestamps,
complementary_incidence_angle=complementary_incidence_angle,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if solar_incidence_series.sun_horizon_position is not None:
sun_horizon_position_series = solar_incidence_series.sun_horizon_position
else:
sun_horizon_position_series = NOT_AVAILABLE
if shading_model:
surface_in_shade_series = model_surface_in_shade_series(
horizon_profile=horizon_profile,
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_time_model=solar_time_model,
solar_position_model=solar_position_model,
shading_model=shading_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
position_series = (
solar_declination_series if solar_declination_series is not None else None,
solar_hour_angle_series if solar_hour_angle_series is not None else None,
solar_zenith_series if solar_zenith_series is not None else None,
solar_altitude_series if solar_altitude_series is not None else None,
solar_azimuth_series if solar_azimuth_series is not None else None,
surface_orientation if surface_orientation is not None else None,
surface_tilt if surface_tilt is not None else None,
solar_incidence_series if solar_incidence_series is not None else None,
sun_horizon_position_series if sun_horizon_position_series is not None else None,
surface_in_shade_series if surface_in_shade_series is not None else None,
solar_event_type_series if solar_event_series.event_type is not None else None,
solar_event_time_series if solar_event_series.value is not None else None,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return position_series
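Several branches above convert the surface orientation from the north-based to the south-based azimuth convention before computing the incidence angle, since Iqbal (1983) measures azimuthal angles from south. The body of the library's `convert_north_to_south_radians_convention()` is not shown here; a minimal sketch of such a conversion (the function name and the wrapping into (-pi, pi] are assumptions, not the library's actual implementation) could look like:

```python
import math

def north_to_south_convention(north_based_angle: float) -> float:
    """Convert an azimuth measured clockwise from north (radians) to a
    south-based convention where 0 points south, wrapped into (-pi, pi]."""
    south_based = north_based_angle - math.pi
    # wrap the result into the interval (-pi, pi]
    while south_based <= -math.pi:
        south_based += 2.0 * math.pi
    while south_based > math.pi:
        south_based -= 2.0 * math.pi
    return south_based

# A south-facing surface (180 degrees from north) maps to 0 in the
# south-based convention.
print(north_to_south_convention(math.pi))  # 0.0
```

In this sketch, west (270 degrees from north) maps to +pi/2 and north maps to +pi; the actual sign convention used by the library may differ.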
shading ¶
Functions:
| Name | Description |
|---|---|
calculate_surface_in_shade_series | Calculates location shade using the requested models and returns the results in a dictionary. |
model_surface_in_shade_series | |
calculate_surface_in_shade_series ¶
calculate_surface_in_shade_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo | None,
horizon_profile: DataArray | None,
shading_models: List[ShadingModel] = [pvgis],
solar_time_model: SolarTimeModel = noaa,
solar_position_model: SolarPositionModel = noaa,
adjust_for_atmospheric_refraction: bool = True,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> Dict
Calculates location shade using the requested models and returns the results in a dictionary.
Source code in pvgisprototype/api/position/shading.py
@log_function_call
@validate_with_pydantic(CalculateSurfaceInShadeSeriesInputModel)
def calculate_surface_in_shade_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo | None,
horizon_profile: DataArray | None,
shading_models: List[ShadingModel] = [ShadingModel.pvgis],
solar_time_model: SolarTimeModel = SolarTimeModel.noaa,
solar_position_model: SolarPositionModel = SolarPositionModel.noaa,
adjust_for_atmospheric_refraction: bool = True,
# unrefracted_solar_zenith: UnrefractedSolarZenith | None = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> Dict:
"""Calculates location shade using the requested models and returns the
results in a dictionary.
"""
results = {}
for shading_model in shading_models:
if shading_model != ShadingModel.all: # ignore 'all' in the enumeration
surface_in_shade_series = model_surface_in_shade_series(
horizon_profile=horizon_profile,
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_time_model=solar_time_model,
solar_position_model=solar_position_model,
shading_model=shading_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
surface_in_shade_series.build_output(
verbose=verbose,
fingerprint=fingerprint,
angle_output_units=angle_output_units,
)
surface_in_shade_overview = {
solar_position_model.name: surface_in_shade_series.output,
}
results = results | surface_in_shade_overview
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return results
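The per-model loop above accumulates one overview dictionary per model via the dict-union operator. A self-contained illustration of that accumulation pattern, with placeholder names and values standing in for the real enumeration members and series outputs:

```python
# Placeholder model names and outputs; the real code iterates over
# ShadingModel members and stores each series' build_output() result.
results = {}
for model_name in ("pvgis", "pvlib"):
    overview = {model_name: {"surface_in_shade": [False, True, False]}}
    results = results | overview  # dict union requires Python 3.9+

print(sorted(results))  # ['pvgis', 'pvlib']
```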
model_surface_in_shade_series ¶
model_surface_in_shade_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex | Timestamp | None,
timezone: ZoneInfo | None,
horizon_profile: DataArray | None,
solar_time_model: SolarTimeModel = noaa,
solar_position_model: SolarPositionModel = noaa,
shading_model: ShadingModel = pvgis,
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
) -> LocationShading
Source code in pvgisprototype/api/position/shading.py
@log_function_call
@custom_cached
@validate_with_pydantic(ModelSurfaceInShadeSeriesInputModel)
def model_surface_in_shade_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex | Timestamp | None,
timezone: ZoneInfo | None,
horizon_profile: DataArray | None,
solar_time_model: SolarTimeModel = SolarTimeModel.noaa,
solar_position_model: SolarPositionModel = SolarPositionModel.noaa,
shading_model: ShadingModel = ShadingModel.pvgis,
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
# unrefracted_solar_zenith: UnrefractedSolarZenith | None = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
) -> LocationShading:
""" """
logger.debug(
f"Executing shading modelling function model_shade_series() for\n{timestamps}",
alt=f"Executing [underline]shading modelling[/underline] function model_shade_series() for\n{timestamps}"
)
surface_in_shade_series = None
solar_altitude_series = model_solar_altitude_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_position_model=solar_position_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
# solar_time_model=solar_time_model,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
solar_azimuth_series = model_solar_azimuth_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_position_model=solar_position_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
solar_time_model=solar_time_model,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=0,
log=log,
validate_output=validate_output,
)
if shading_model.value == ShadingModel.pvlib:
pass
if shading_model.value == ShadingModel.pvgis:
surface_in_shade_series = calculate_surface_in_shade_series_pvgis(
solar_altitude_series=solar_altitude_series,
solar_azimuth_series=solar_azimuth_series,
horizon_profile=horizon_profile,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
logger.debug(
f"Returning surface in shade time series :\n{surface_in_shade_series}",
alt=f"Returning [gray]surface in shade[/gray] time series :\n{surface_in_shade_series}"
)
return surface_in_shade_series
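`calculate_surface_in_shade_series_pvgis()` is not reproduced here. Conceptually, a surface is shaded whenever the horizon height at the solar azimuth exceeds the solar altitude; a minimal sketch of that test, assuming a horizon profile sampled at evenly spaced azimuths starting from north and nearest-sample lookup (the function name and sampling scheme are illustrative assumptions, not the library's implementation):

```python
import math

def surface_in_shade(solar_altitude: float, solar_azimuth: float,
                     horizon_heights: list) -> bool:
    """Return True when the sun sits below the horizon profile.

    horizon_heights holds horizon elevation angles (radians) sampled at
    evenly spaced azimuths starting from 0 (north), covering the full circle.
    """
    step = 2.0 * math.pi / len(horizon_heights)
    index = round(solar_azimuth / step) % len(horizon_heights)  # nearest sample
    return solar_altitude < horizon_heights[index]

# A uniform 10-degree-high ridge all around: a sun at 5 degrees altitude
# is shaded, at 20 degrees it is not.
flat_ridge = [math.radians(10.0)] * 36
print(surface_in_shade(math.radians(5.0), math.radians(180.0), flat_ridge))  # True
```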
solar_time ¶
Functions:
| Name | Description |
|---|---|
calculate_solar_time_series | Calculates the solar time using the requested models and returns the results in a dictionary. |
model_solar_time_series | Calculates the solar time using the requested algorithm. |
calculate_solar_time_series ¶
calculate_solar_time_series(
longitude: Longitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_time_models: List[SolarTimeModel] = [noaa],
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = VERBOSE_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> Dict
Calculates the solar time using the requested models and returns the results in a dictionary.
Source code in pvgisprototype/api/position/solar_time.py
def calculate_solar_time_series(
longitude: Longitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_time_models: List[SolarTimeModel] = [SolarTimeModel.noaa],
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = VERBOSE_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> Dict:
"""Calculates the solar time using all models and returns the results in a table.
Parameters
----------
Returns
-------
"""
results = {}
# solar_time_models = select_models(SolarTimeModel, solar_time_model) # Using a callback fails!
for solar_time_model in solar_time_models:
if (
solar_time_model != SolarTimeModel.all
): # ignore 'all' in the enumeration
solar_time_series = model_solar_time_series(
longitude=longitude,
timestamps=timestamps,
timezone=timezone,
solar_time_model=solar_time_model,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
solar_time_model_overview = {
solar_time_model.name: {
TIME_ALGORITHM_NAME: (
solar_time_model.value if solar_time_model else NOT_AVAILABLE
),
SOLAR_TIME_NAME: (
solar_time_series if solar_time_series else NOT_AVAILABLE
),
UNIT_NAME: MINUTES,
}
}
results = results | solar_time_model_overview
return results
model_solar_time_series ¶
model_solar_time_series(
longitude: Longitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_time_model: SolarTimeModel = noaa,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
)
Calculates the solar time using the requested algorithm.
Returns:
| Type | Description |
|---|---|
SolarTime | |
Source code in pvgisprototype/api/position/solar_time.py
@log_function_call
@validate_with_pydantic(ModelSolarTimeTimeSeriesInputModel)
def model_solar_time_series(
longitude: Longitude,
# latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_time_model: SolarTimeModel = SolarTimeModel.noaa,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
):
"""Calculates the solar time using the requested _algorithm_.
Parameters
----------
input : SolarTimeInput
Returns
-------
SolarTime
"""
solar_time_series = None
if solar_time_model.value == SolarTimeModel.milne:
solar_time_series = calculate_apparent_solar_time_series_milne1921(
longitude=longitude,
timestamps=timestamps,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
if solar_time_model.value == SolarTimeModel.pvgis:
# Requires : time_offset_global, hour_offset
pass
if solar_time_model.value == SolarTimeModel.noaa:
solar_time_series = calculate_true_solar_time_series_noaa(
longitude=longitude,
timestamps=timestamps,
timezone=timezone,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
if solar_time_model.value == SolarTimeModel.skyfield:
pass
# # vvv vvv vvv --------------------------------------- expects degrees!
# solar_time = calculate_solar_time_skyfield(
# longitude=longitude,
# latitude=latitude,
# timestamp=timestamp,
# timezone=timezone,
# verbose=verbose,
# )
# # ^^^ ^^^ ^^^ --------------------------------------- expects degrees!
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return solar_time_series
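The NOAA branch delegates to `calculate_true_solar_time_series_noaa()`, whose body is not shown here. The standard NOAA relation corrects local clock time by the equation of time plus four minutes per degree of offset from the time-zone meridian; a sketch of that relation (the function name, units in minutes and degrees, and the scalar formulation are assumptions for illustration):

```python
def true_solar_time_minutes(clock_time_minutes: float,
                            longitude_degrees: float,
                            utc_offset_hours: float,
                            equation_of_time_minutes: float) -> float:
    """True solar time in minutes past midnight (NOAA-style formulation)."""
    timezone_meridian_degrees = 15.0 * utc_offset_hours  # 360 deg / 24 h
    longitude_correction = 4.0 * (longitude_degrees - timezone_meridian_degrees)
    return clock_time_minutes + equation_of_time_minutes + longitude_correction

# On the time-zone meridian with a zero equation of time, clock time
# and solar time coincide.
print(true_solar_time_minutes(720.0, 15.0, 1.0, 0.0))  # 720.0
```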
zenith ¶
Functions:
| Name | Description |
|---|---|
calculate_solar_zenith_series | Calculates the solar position using the requested models and returns the results in a dictionary. |
model_solar_zenith_series | Model the solar zenith angle series using the requested solar position model. |
calculate_solar_zenith_series ¶
calculate_solar_zenith_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_position_models: List[SolarPositionModel] = [
noaa
],
adjust_for_atmospheric_refraction: bool = True,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> Dict
Calculates the solar position using the requested models and returns the results in a dictionary.
Source code in pvgisprototype/api/position/zenith.py
@log_function_call
def calculate_solar_zenith_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_position_models: List[SolarPositionModel] = [SolarPositionModel.noaa],
# solar_time_model: SolarTimeModel = SolarTimeModel.noaa,
adjust_for_atmospheric_refraction: bool = True,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
angle_output_units: str = RADIANS,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> Dict:
"""Calculates the solar position using the requested models and returns the
results in a dictionary.
"""
results = {}
for solar_position_model in solar_position_models:
if (
solar_position_model != SolarPositionModel.all
): # ignore 'all' in the enumeration
solar_zenith_series = model_solar_zenith_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_position_model=solar_position_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
solar_zenith_series.build_output(
verbose=verbose,
fingerprint=fingerprint,
angle_output_units=angle_output_units,
)
solar_zenith_overview = {
solar_position_model.name: solar_zenith_series.output
}
results = results | solar_zenith_overview
return results
model_solar_zenith_series ¶
model_solar_zenith_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_position_model: SolarPositionModel = noaa,
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> SolarZenith
Notes
The solar altitude angle is measured from the horizon up towards the zenith (positive) and down towards the nadir (negative). The altitude is zero all along the horizon, the great circle midway between zenith and nadir.
- All solar calculation functions return floating angular measurements in radians.
Source code in pvgisprototype/api/position/zenith.py
@log_function_call
@custom_cached
@validate_with_pydantic(ModelSolarAltitudeTimeSeriesInputModel)
def model_solar_zenith_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
solar_position_model: SolarPositionModel = SolarPositionModel.noaa,
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
) -> SolarZenith:
"""
Notes
-----
The solar altitude angle is measured from the horizon up towards the zenith
(positive) and down towards the nadir (negative). The altitude is zero all
along the horizon, the great circle midway between zenith and nadir.
- All solar calculation functions return floating angular measurements in
radians.
"""
solar_zenith_series = None
solar_altitude_series = None
if solar_position_model.value == SolarPositionModel.noaa:
solar_zenith_series = calculate_solar_zenith_series_noaa(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
if solar_position_model.value == SolarPositionModel.skyfield:
pass
# if solar_position_model.value == SolarPositionModel.skyfield:
# solar_zenith, solar_azimuth = calculate_solar_zenith_azimuth_skyfield(
# longitude=longitude,
# latitude=latitude,
# timestamp=timestamp,
# )
if solar_position_model.value == SolarPositionModel.suncalc:
pass
# if solar_position_model.value == SolarPositionModel.suncalc:
# # note : first azimuth, then altitude
# solar_azimuth_south_radians_convention, solar_zenith = suncalc.get_position(
# date=timestamp, # this comes first here!
# lng=longitude.degrees,
# lat=latitude.degrees,
# ).values() # zero points to south
# solar_zenith = SolarAltitude(
# value=solar_zenith,
# unit=RADIANS,
# solar_positioning_algorithm='suncalc',
# solar_timing_algorithm='suncalc',
# )
# if (
# not isfinite(solar_zenith.degrees)
# or not solar_zenith.min_degrees <= solar_zenith.degrees <= solar_zenith.max_degrees
# ):
# raise ValueError(
# f"The calculated solar altitude angle {solar_zenith.degrees} is out of the expected range\
# [{solar_zenith.min_degrees}, {solar_zenith.max_degrees}] degrees"
# )
if solar_position_model.value == SolarPositionModel.jenco:
solar_altitude_series = calculate_solar_altitude_series_jenco(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if solar_position_model.value == SolarPositionModel.hofierka:
solar_altitude_series = calculate_solar_altitude_series_hofierka(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if solar_position_model.value == SolarPositionModel.pvlib:
pass
# if solar_position_model.value == SolarPositionModel.pvlib:
# solar_zenith = calculate_solar_zenith_pvlib(
# longitude=longitude,
# latitude=latitude,
# timestamp=timestamp,
# timezone=timezone,
# verbose=verbose,
# )
if isinstance(solar_altitude_series, SolarAltitude):
solar_zenith_series = SolarZenith(
value=(np.pi / 2) - solar_altitude_series.radians,
unit=RADIANS,
solar_positioning_algorithm=solar_altitude_series.solar_positioning_algorithm,
timing_algorithm=solar_altitude_series.solar_timing_algorithm,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return solar_zenith_series
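The fallback at the end of the function derives a zenith series from an altitude series computed by the jenco or hofierka branches. Altitude and zenith are complementary angles, zenith = pi/2 - altitude; a one-line sketch of that conversion (plain floats here, rather than the library's `SolarAltitude`/`SolarZenith` objects):

```python
import math

def zenith_from_altitude(altitude_radians: float) -> float:
    """Solar zenith angle as the complement of the solar altitude."""
    return math.pi / 2.0 - altitude_radians

# The sun at the zenith (altitude 90 degrees) gives a zenith angle of 0.
print(zenith_from_altitude(math.pi / 2.0))  # 0.0
```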
power ¶
Modules:
| Name | Description |
|---|---|
broadband | |
broadband_multiple_surfaces | |
broadband_rear_side | |
efficiency | |
temperature | |
broadband ¶
Functions:
| Name | Description |
|---|---|
calculate_photovoltaic_power_output_series | Estimate the photovoltaic power output of a PV system over a time series or as an arbitrarily aggregated energy production. |
calculate_photovoltaic_power_output_series ¶
calculate_photovoltaic_power_output_series(
longitude: Longitude,
latitude: Latitude,
elevation: Elevation,
surface_orientation: (
SurfaceOrientation | None
) = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt | None = SURFACE_TILT_DEFAULT,
timestamps: DatetimeIndex = DatetimeIndex(
[now(tz="UTC")]
),
timezone: ZoneInfo | None = ZoneInfo("UTC"),
global_horizontal_irradiance: ndarray | None = None,
direct_horizontal_irradiance: ndarray | None = None,
spectral_factor_series: SpectralFactorSeries = SpectralFactorSeries(
value=SPECTRAL_FACTOR_DEFAULT
),
temperature_series: ndarray = array(
TEMPERATURE_DEFAULT
),
wind_speed_series: ndarray = array(WIND_SPEED_DEFAULT),
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
albedo: float | None = ALBEDO_DEFAULT,
apply_reflectivity_factor: bool = ANGULAR_LOSS_FACTOR_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
sun_horizon_position: List[
SunHorizonPositionModel
] = SUN_HORIZON_POSITION_DEFAULT,
solar_incidence_model: SolarIncidenceModel = iqbal,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = pvgis,
shading_states: List[ShadingState] = [all],
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
photovoltaic_module_type: PhotovoltaicModuleType = Monofacial,
bifaciality_factor: float = 0.3,
photovoltaic_module: PhotovoltaicModuleModel = CSI_FREE_STANDING,
peak_power: float = PEAK_POWER_DEFAULT,
system_efficiency: (
float | None
) = SYSTEM_EFFICIENCY_DEFAULT,
power_model: PhotovoltaicModulePerformanceModel = king,
radiation_cutoff_threshold: float = RADIATION_CUTOFF_THRESHHOLD,
temperature_model: ModuleTemperatureAlgorithm = faiman,
efficiency: float | None = EFFICIENCY_FACTOR_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
angle_output_units: str = RADIANS,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
profile: bool = cPROFILE_FLAG_DEFAULT,
)
Estimate the photovoltaic power output of a PV system over a time series, or its arbitrarily aggregated energy production, based on the effective solar irradiance incident on the solar surface, the ambient temperature and, optionally, the wind speed.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
longitude | float | The longitude of the location for which the energy production is calculated. | required |
latitude | float | The latitude of the location. | required |
elevation | float | Elevation of the location in meters. | required |
timestamps | DatetimeIndex | Specific timestamps for which to calculate the power output. Defaults to a single-element index holding the current UTC time. | DatetimeIndex([now(tz='UTC')]) |
timezone | ZoneInfo | None | Timezone of the location. Default is UTC. | ZoneInfo('UTC') |
global_horizontal_irradiance | ndarray | None | Series of global horizontal irradiance values. Default is None. | None |
direct_horizontal_irradiance | ndarray | None | Series of direct horizontal irradiance values. Default is None. | None |
temperature_series | ndarray | Series of temperature values. Default is TEMPERATURE_DEFAULT. | array(TEMPERATURE_DEFAULT) |
wind_speed_series | ndarray | Series of wind speed values. Default is WIND_SPEED_DEFAULT. | array(WIND_SPEED_DEFAULT) |
Returns:
| Name | Type | Description |
|---|---|---|
photovoltaic_power_output_series | ndarray | Array of photovoltaic power output values. |
results | dict | Dictionary containing detailed results of the calculation. |
title | str | Title of the output data. |
Examples:
>>> calculate_photovoltaic_power_output_series(10.0, 20.0, 100.0)
# Returns the photovoltaic power output series, results, and title for the specified parameters.
Notes
This function is part of the Typer-based CLI for the new PVGIS implementation in Python. It provides an interface for estimating the energy production of a photovoltaic system, taking into account various environmental and system parameters.
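At its core, the computation is a per-timestamp scaling of the in-plane irradiance: the global inclined irradiance is multiplied by an efficiency factor (from the power model, or a user-supplied value) and then by the system efficiency. A minimal sketch with plain NumPy arrays and illustrative values (the names are hypothetical, not the pvgisprototype API):

```python
import numpy as np

# Hypothetical hourly inputs: in-plane irradiance in W/m^2 and a
# per-timestamp module efficiency factor (as a power model would produce).
global_inclined_irradiance = np.array([0.0, 450.0, 820.0, 610.0])
efficiency_factor = np.array([0.0, 0.92, 0.89, 0.91])
system_efficiency = 0.86  # lumped wiring/inverter losses

# Power before system losses, then the final output series.
power_without_system_loss = global_inclined_irradiance * efficiency_factor
power_output = power_without_system_loss * system_efficiency
```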
Source code in pvgisprototype/api/power/broadband.py
@log_function_call
def calculate_photovoltaic_power_output_series(
longitude: Longitude,
latitude: Latitude,
elevation: Elevation,
surface_orientation: SurfaceOrientation | None = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt | None = SURFACE_TILT_DEFAULT,
#
timestamps: DatetimeIndex = DatetimeIndex([Timestamp.now(tz='UTC')]),
timezone: ZoneInfo | None = ZoneInfo("UTC"),
#
global_horizontal_irradiance: ndarray | None = None,
direct_horizontal_irradiance: ndarray | None = None,
spectral_factor_series: SpectralFactorSeries = SpectralFactorSeries(
value=SPECTRAL_FACTOR_DEFAULT
),
temperature_series: numpy.ndarray = numpy.array(TEMPERATURE_DEFAULT),
wind_speed_series: numpy.ndarray = numpy.array(WIND_SPEED_DEFAULT),
#
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
# unrefracted_solar_zenith: UnrefractedSolarZenith | None = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT,
albedo: float | None = ALBEDO_DEFAULT,
#
apply_reflectivity_factor: bool = ANGULAR_LOSS_FACTOR_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
sun_horizon_position: List[SunHorizonPositionModel] = SUN_HORIZON_POSITION_DEFAULT,
#
solar_incidence_model: SolarIncidenceModel = SolarIncidenceModel.iqbal,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
#
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = ShadingModel.pvgis,
shading_states: List[ShadingState] = [ShadingState.all],
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
#
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
#
photovoltaic_module_type: PhotovoltaicModuleType = PhotovoltaicModuleType.Monofacial, # Leave Me Like This !
bifaciality_factor: float = 0.3, # 0.7, # Fixed !
photovoltaic_module: PhotovoltaicModuleModel = PhotovoltaicModuleModel.CSI_FREE_STANDING,
peak_power: float = PEAK_POWER_DEFAULT,
#
system_efficiency: float | None = SYSTEM_EFFICIENCY_DEFAULT,
power_model: PhotovoltaicModulePerformanceModel = PhotovoltaicModulePerformanceModel.king,
radiation_cutoff_threshold: float = RADIATION_CUTOFF_THRESHHOLD,
temperature_model: ModuleTemperatureAlgorithm = ModuleTemperatureAlgorithm.faiman,
efficiency: float | None = EFFICIENCY_FACTOR_DEFAULT,
#
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
#
angle_output_units: str = RADIANS,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
profile: bool = cPROFILE_FLAG_DEFAULT,
):
"""
Estimate the photovoltaic power over a time series or an arbitrarily
aggregated energy production of a PV system based on the effective solar
irradiance incident on a solar surface, the ambient temperature and
optionally wind speed.
Parameters
----------
longitude : float
The longitude of the location for which the energy production is calculated.
latitude : float
The latitude of the location.
elevation : float
Elevation of the location in meters.
timestamps : DatetimeIndex, optional
Specific timestamps for which to calculate the power output. Defaults to a single-element index holding the current UTC time.
timezone : ZoneInfo | None, optional
Timezone of the location. Default is UTC.
global_horizontal_irradiance : ndarray | None, optional
Series of global horizontal irradiance values. Default is None.
direct_horizontal_irradiance : ndarray | None, optional
Series of direct horizontal irradiance values. Default is None.
temperature_series : ndarray
Series of temperature values. Default is TEMPERATURE_DEFAULT.
wind_speed_series : ndarray
Series of wind speed values. Default is WIND_SPEED_DEFAULT.
# ... other parameters ...
Returns
-------
photovoltaic_power_output_series : ndarray
Array of photovoltaic power output values.
results : dict
Dictionary containing detailed results of the calculation.
title : str
Title of the output data.
Examples
--------
>>> calculate_photovoltaic_power_output_series(10.0, 20.0, 100.0)
# Returns the photovoltaic power output series, results, and title for the specified parameters.
Notes
-----
This function is part of the Typer-based CLI for the new PVGIS
implementation in Python. It provides an interface for estimating the
energy production of a photovoltaic system, taking into account various
environmental and system parameters.
"""
if profile:
import cProfile
pr = cProfile.Profile()
pr.enable()
# In-Plane Irradiance After Reflectivity Loss
# [ also referred to as inlined irradiance ]
global_inclined_irradiance_series = calculate_global_inclined_irradiance(
longitude=longitude,
latitude=latitude,
elevation=elevation,
surface_orientation=surface_orientation,
surface_tilt=surface_tilt,
timestamps=timestamps,
timezone=timezone,
global_horizontal_irradiance=global_horizontal_irradiance, # time series optional
direct_horizontal_irradiance=direct_horizontal_irradiance, # time series, optional
linke_turbidity_factor_series=linke_turbidity_factor_series,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
apply_reflectivity_factor=apply_reflectivity_factor,
solar_position_model=solar_position_model,
sun_horizon_position=sun_horizon_position,
solar_incidence_model=solar_incidence_model,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
horizon_profile=horizon_profile,
shading_model=shading_model,
shading_states=shading_states,
solar_time_model=solar_time_model,
solar_constant=solar_constant,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
# angle_output_units=angle_output_units,
dtype=dtype,
array_backend=array_backend,
validate_output=validate_output,
verbose=verbose,
log=log,
fingerprint=fingerprint,
)
rear_side_global_inclined_irradiance_series = None # to avoid the "unbound error"
if photovoltaic_module_type == PhotovoltaicModuleType.Bifacial:
# Redesign Me : Maybe rethink the logic to get the rear side angles ?
from math import pi
rear_side_surface_orientation = pi - surface_orientation
rear_side_surface_tilt = pi - surface_tilt
# --------------------------------------------------------------------
rear_side_global_inclined_irradiance_series = calculate_global_inclined_irradiance(
longitude=longitude,
latitude=latitude,
elevation=elevation,
surface_orientation=rear_side_surface_orientation, # Critical !
surface_tilt=rear_side_surface_tilt, # Critical !
timestamps=timestamps,
timezone=timezone,
global_horizontal_irradiance=global_horizontal_irradiance, # time series optional
direct_horizontal_irradiance=direct_horizontal_irradiance, # time series, optional
linke_turbidity_factor_series=linke_turbidity_factor_series,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
apply_reflectivity_factor=apply_reflectivity_factor,
solar_position_model=solar_position_model,
sun_horizon_position=sun_horizon_position,
solar_incidence_model=solar_incidence_model,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
horizon_profile=horizon_profile,
shading_model=shading_model,
shading_states=shading_states,
solar_time_model=solar_time_model,
solar_constant=solar_constant,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
# angle_output_units=angle_output_units,
dtype=dtype,
array_backend=array_backend,
validate_output=validate_output,
verbose=verbose,
log=log,
fingerprint=fingerprint,
)
logger.info(
"i [bold]Applying[/bold] [magenta]the bifaciality factor[/magenta] on the rear-side global inclined irradiance .."
)
if bifaciality_factor:
rear_side_global_inclined_irradiance_series.value *= bifaciality_factor
# -------------------------------------------------------- Redesign Me ---
global_inclined_irradiance_series.value += (
rear_side_global_inclined_irradiance_series.value
)
if not power_model:
if not efficiency: # user-set -- RenameMe ? FIXME
efficiency_factor_series = system_efficiency
else:
efficiency_factor_series = efficiency
else:
if efficiency:
efficiency_factor_series = efficiency
else:
efficiency_series = calculate_photovoltaic_efficiency_series(
irradiance_series=global_inclined_irradiance_series.value,
photovoltaic_module=photovoltaic_module,
photovoltaic_module_type=photovoltaic_module_type,
bifaciality_factor=bifaciality_factor,
power_model=power_model,
temperature_model=temperature_model,
# model_constants=EFFICIENCY_MODEL_COEFFICIENTS_DEFAULT,
spectral_factor_series=spectral_factor_series, # required for the Power model !
temperature_series=temperature_series,
standard_test_temperature=TEMPERATURE_DEFAULT,
wind_speed_series=wind_speed_series,
radiation_cutoff_threshold=radiation_cutoff_threshold,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
efficiency_factor_series = efficiency_series.value
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug(
"i [bold]Applying[/bold] [magenta]efficiency coefficients[/magenta] on the global inclined irradiance .."
)
# Power Model efficiency coefficients include temperature and low irradiance effect !
photovoltaic_power_output_without_system_loss_series = (
global_inclined_irradiance_series.value * efficiency_factor_series
) # Safer to deepcopy the efficiency_series which are modified _after_ this point ?
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug(
"i [bold]Applying[/bold] [magenta]system loss[/magenta] on the effective photovoltaic power .."
)
photovoltaic_power_output_series = (
photovoltaic_power_output_without_system_loss_series * system_efficiency
)
out_of_range, out_of_range_index = identify_values_out_of_range(
series=photovoltaic_power_output_series,
shape=timestamps.shape,
data_model=PhotovoltaicPowerFromExternalData(),
)
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug("i [bold]Building the output[/bold] ..")
if isinstance(
global_inclined_irradiance_series.direct_horizontal_irradiance,
DirectHorizontalIrradianceFromExternalData,
):
photovoltaic_power = PhotovoltaicPowerFromExternalData(
value=photovoltaic_power_output_series,
out_of_range=out_of_range,
out_of_range_index=out_of_range_index,
photovoltaic_power_without_system_loss=photovoltaic_power_output_without_system_loss_series,
#
photovoltaic_module_type=photovoltaic_module_type,
technology=photovoltaic_module.value,
power_model=power_model.value,
system_efficiency=system_efficiency,
efficiency_factor=efficiency_factor_series,
#
temperature=temperature_series,
wind_speed=wind_speed_series,
#
effective_global_irradiance=global_inclined_irradiance_series.value
* efficiency_factor_series,
effective_direct_irradiance=global_inclined_irradiance_series.direct_inclined_irradiance
* efficiency_factor_series,
effective_diffuse_irradiance=global_inclined_irradiance_series.diffuse_inclined_irradiance
* efficiency_factor_series,
effective_ground_reflected_irradiance=global_inclined_irradiance_series.ground_reflected_inclined_irradiance
* efficiency_factor_series,
spectral_effect=efficiency_series.effective_irradiance.spectral_effect,
spectral_effect_percentage=efficiency_series.effective_irradiance.spectral_effect_percentage,
spectral_factor=spectral_factor_series,
#
peak_power=peak_power,
## Inclined Irradiance Components
global_inclined_irradiance=global_inclined_irradiance_series.value,
direct_inclined_irradiance=global_inclined_irradiance_series.direct_inclined_irradiance,
diffuse_inclined_irradiance=global_inclined_irradiance_series.diffuse_inclined_irradiance,
ground_reflected_inclined_irradiance=global_inclined_irradiance_series.ground_reflected_inclined_irradiance,
#
rear_side_global_inclined_irradiance_series=rear_side_global_inclined_irradiance_series.value if rear_side_global_inclined_irradiance_series else None,
## Loss due to Reflectivity
global_inclined_reflected=global_inclined_irradiance_series.reflected,
direct_inclined_reflected=global_inclined_irradiance_series.direct_inclined_reflected,
diffuse_inclined_reflected=global_inclined_irradiance_series.diffuse_inclined_reflected,
ground_reflected_inclined_reflected=global_inclined_irradiance_series.ground_reflected_inclined_reflected,
#
## Reflectivity Factor for Irradiance Components
direct_inclined_reflectivity_factor=global_inclined_irradiance_series.direct_inclined_reflectivity_factor,
diffuse_inclined_reflectivity_factor=global_inclined_irradiance_series.diffuse_inclined_reflectivity_factor,
ground_reflected_inclined_reflectivity_factor=global_inclined_irradiance_series.ground_reflected_inclined_reflectivity_factor,
#
## Reflectivity Coefficient which defines the Reflectivity Factor for Irradiance Components
# direct_inclined_reflectivity_coefficient=direct_inclined_reflectivity_coefficient_series,
diffuse_inclined_reflectivity_coefficient=global_inclined_irradiance_series.diffuse_inclined_reflectivity_coefficient,
# ground_reflected_inclined_reflectivity_coefficient=ground_reflected_inclined_reflectivity_coefficient_series,
#
## Inclined Irradiance before loss due to Reflectivity
global_inclined_before_reflectivity=global_inclined_irradiance_series.value_before_reflectivity,
direct_inclined_before_reflectivity=global_inclined_irradiance_series.direct_inclined_before_reflectivity,
diffuse_inclined_before_reflectivity=global_inclined_irradiance_series.diffuse_inclined_before_reflectivity,
ground_reflected_inclined_before_reflectivity=global_inclined_irradiance_series.ground_reflected_inclined_before_reflectivity,
#
## Horizontal Irradiance Components
global_horizontal_irradiance=global_horizontal_irradiance,
direct_horizontal_irradiance=global_inclined_irradiance_series.direct_horizontal_irradiance,
diffuse_horizontal_irradiance=global_inclined_irradiance_series.diffuse_horizontal_irradiance,
#
## Components of the Extraterrestrial irradiance
extraterrestrial_horizontal_irradiance=global_inclined_irradiance_series.extraterrestrial_horizontal_irradiance,
extraterrestrial_normal_irradiance=global_inclined_irradiance_series.extraterrestrial_normal_irradiance,
linke_turbidity_factor=linke_turbidity_factor_series,
#
## Location and Position
elevation=elevation,
surface_orientation=surface_orientation,
surface_tilt=surface_tilt,
sun_horizon_positions=global_inclined_irradiance_series.sun_horizon_positions, # states != sun_horizon_position
#
## Solar Position parameters
horizon_height=global_inclined_irradiance_series.surface_in_shade.horizon_height,
surface_in_shade=global_inclined_irradiance_series.surface_in_shade,
visible=global_inclined_irradiance_series.surface_in_shade.visible,
solar_incidence=global_inclined_irradiance_series.solar_incidence,
shading_state=global_inclined_irradiance_series.shading_state,
sun_horizon_position=global_inclined_irradiance_series.sun_horizon_position, # position != sun_horizon_positions
solar_altitude=global_inclined_irradiance_series.solar_altitude,
refracted_solar_altitude=global_inclined_irradiance_series.refracted_solar_altitude,
solar_azimuth=global_inclined_irradiance_series.solar_azimuth,
solar_azimuth_origin=global_inclined_irradiance_series.solar_azimuth.origin,
# azimuth_difference=azimuth_difference_series,
#
## Positioning, Timing and Atmospheric algorithms
angle_output_units=global_inclined_irradiance_series.solar_incidence.unit, # Maybe get from surface_[orientation|tilt] ?
solar_positioning_algorithm=global_inclined_irradiance_series.solar_positioning_algorithm,
solar_timing_algorithm=global_inclined_irradiance_series.solar_timing_algorithm,
adjusted_for_atmospheric_refraction=global_inclined_irradiance_series.adjusted_for_atmospheric_refraction,
solar_incidence_model=global_inclined_irradiance_series.solar_incidence_model,
solar_incidence_definition=global_inclined_irradiance_series.solar_incidence.definition,
# SOLAR_CONSTANT_COLUMN_NAME: solar_constant,
# ECCENTRICITY_PHASE_OFFSET_COLUMN_NAME: eccentricity_phase_offset,
# ECCENTRICITY_CORRECTION_FACTOR_COLUMN_NAME: eccentricity_amplitude,
shading_algorithm=global_inclined_irradiance_series.shading_algorithm,
shading_states=shading_states,
#
## Sources
)
else:
photovoltaic_power = PhotovoltaicPower(
value=photovoltaic_power_output_series,
out_of_range=out_of_range,
out_of_range_index=out_of_range_index,
photovoltaic_power_without_system_loss=photovoltaic_power_output_without_system_loss_series,
photovoltaic_module_type=photovoltaic_module_type,
technology=photovoltaic_module.value,
power_model=power_model.value,
system_efficiency=system_efficiency,
efficiency_factor=efficiency_factor_series,
temperature=temperature_series,
wind_speed=wind_speed_series,
#
## Effective Irradiance Components
effective_global_irradiance=global_inclined_irradiance_series.value * efficiency_factor_series,
effective_direct_irradiance=global_inclined_irradiance_series.direct_inclined_irradiance * efficiency_factor_series,
effective_diffuse_irradiance=global_inclined_irradiance_series.diffuse_inclined_irradiance * efficiency_factor_series,
effective_ground_reflected_irradiance=global_inclined_irradiance_series.ground_reflected_inclined_irradiance * efficiency_factor_series,
spectral_effect=efficiency_series.effective_irradiance.spectral_effect,
spectral_effect_percentage=efficiency_series.effective_irradiance.spectral_effect_percentage,
spectral_factor=spectral_factor_series,
peak_power=peak_power,
#
## Inclined Irradiance Components
global_inclined_irradiance=global_inclined_irradiance_series.value,
direct_inclined_irradiance=global_inclined_irradiance_series.direct_inclined_irradiance,
diffuse_inclined_irradiance=global_inclined_irradiance_series.diffuse_inclined_irradiance,
ground_reflected_inclined_irradiance=global_inclined_irradiance_series.ground_reflected_inclined_irradiance,
#
## Loss due to Reflectivity
global_inclined_reflected=global_inclined_irradiance_series.reflected,
direct_inclined_reflected=global_inclined_irradiance_series.direct_inclined_reflected,
diffuse_inclined_reflected=global_inclined_irradiance_series.diffuse_inclined_reflected,
ground_reflected_inclined_reflected=global_inclined_irradiance_series.ground_reflected_inclined_reflected,
#
## Reflectivity Factor for Irradiance Components
direct_inclined_reflectivity_factor=global_inclined_irradiance_series.direct_inclined_reflectivity_factor,
diffuse_inclined_reflectivity_factor=global_inclined_irradiance_series.diffuse_inclined_reflectivity_factor,
ground_reflected_inclined_reflectivity_factor=global_inclined_irradiance_series.ground_reflected_inclined_reflectivity_factor,
#
## Reflectivity Coefficient which defines the Reflectivity Factor for Irradiance Components
# direct_inclined_reflectivity_coefficient=direct_inclined_reflectivity_coefficient_series,
diffuse_inclined_reflectivity_coefficient=global_inclined_irradiance_series.diffuse_inclined_reflectivity_coefficient,
# ground_reflected_inclined_reflectivity_coefficient=ground_reflected_inclined_reflectivity_coefficient_series,
#
## Inclined Irradiance before loss due to Reflectivity
global_inclined_before_reflectivity=global_inclined_irradiance_series.value_before_reflectivity,
direct_inclined_before_reflectivity=global_inclined_irradiance_series.direct_inclined_before_reflectivity,
diffuse_inclined_before_reflectivity=global_inclined_irradiance_series.diffuse_inclined_before_reflectivity,
ground_reflected_inclined_before_reflectivity=global_inclined_irradiance_series.ground_reflected_inclined_before_reflectivity,
#
## Horizontal Irradiance Components
global_horizontal_irradiance=global_horizontal_irradiance,
direct_horizontal_irradiance=global_inclined_irradiance_series.direct_horizontal_irradiance,
diffuse_horizontal_irradiance=global_inclined_irradiance_series.diffuse_horizontal_irradiance,
#
## Components of the Extraterrestrial irradiance
extraterrestrial_horizontal_irradiance=global_inclined_irradiance_series.extraterrestrial_horizontal_irradiance,
extraterrestrial_normal_irradiance=global_inclined_irradiance_series.extraterrestrial_normal_irradiance,
linke_turbidity_factor=linke_turbidity_factor_series,
#
## Location and Position
elevation=elevation,
surface_orientation=surface_orientation,
surface_tilt=surface_tilt,
sun_horizon_positions=global_inclined_irradiance_series.sun_horizon_positions, # states != sun_horizon_position
#
## Solar Position parameters
horizon_height=global_inclined_irradiance_series.surface_in_shade.horizon_height,
surface_in_shade=global_inclined_irradiance_series.surface_in_shade,
visible=global_inclined_irradiance_series.surface_in_shade.visible,
solar_incidence=global_inclined_irradiance_series.solar_incidence,
shading_state=global_inclined_irradiance_series.shading_state,
sun_horizon_position=global_inclined_irradiance_series.sun_horizon_position, # position != sun_horizon_positions
solar_altitude=global_inclined_irradiance_series.solar_altitude,
refracted_solar_altitude=global_inclined_irradiance_series.refracted_solar_altitude,
solar_azimuth=global_inclined_irradiance_series.solar_azimuth,
solar_azimuth_origin=global_inclined_irradiance_series.solar_azimuth.origin,
# azimuth_difference=azimuth_difference_series,
#
## Positioning, Timing and Atmospheric algorithms
angle_output_units=global_inclined_irradiance_series.solar_incidence.unit, # Maybe get from surface_[orientation|tilt] ?
solar_positioning_algorithm=global_inclined_irradiance_series.solar_positioning_algorithm,
solar_timing_algorithm=global_inclined_irradiance_series.solar_timing_algorithm,
adjusted_for_atmospheric_refraction=global_inclined_irradiance_series.adjusted_for_atmospheric_refraction,
solar_incidence_model=global_inclined_irradiance_series.solar_incidence_model,
solar_incidence_definition=global_inclined_irradiance_series.solar_incidence.definition,
# SOLAR_CONSTANT_COLUMN_NAME: solar_constant,
# ECCENTRICITY_PHASE_OFFSET_COLUMN_NAME: eccentricity_phase_offset,
# ECCENTRICITY_CORRECTION_FACTOR_COLUMN_NAME: eccentricity_amplitude,
shading_algorithm=global_inclined_irradiance_series.shading_algorithm,
shading_states=shading_states,
#
## Sources
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
photovoltaic_power.build_output(
verbose=verbose,
fingerprint=fingerprint,
angle_output_units=angle_output_units,
)
if profile:
import io
import pstats
pr.disable()
# write profiling statistics to file
profile_filename = "profiling_stats.prof"
pr.dump_stats(profile_filename)
print(f"Profiling statistics saved to {profile_filename}")
s = io.StringIO()
sortby = pstats.SortKey.CUMULATIVE
ps = pstats.Stats(pr, stream=s).sort_stats(sortby)
ps.print_stats()
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
print(s.getvalue())
log_data_fingerprint(
data=photovoltaic_power_output_series,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return photovoltaic_power
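For bifacial modules, the function above computes a second in-plane irradiance series for the rear side of the module and adds it, scaled by the bifaciality factor, to the front-side irradiance before the efficiency step. That combination can be sketched as follows (plain NumPy, illustrative values only):

```python
import numpy as np

# Hypothetical in-plane irradiance series in W/m^2 for both module faces.
front_inclined_irradiance = np.array([300.0, 650.0, 900.0])
rear_inclined_irradiance = np.array([40.0, 90.0, 120.0])  # mostly diffuse and ground-reflected
bifaciality_factor = 0.3  # rear-side response relative to the front side

# Effective plane-of-array irradiance seen by a bifacial module.
total_irradiance = front_inclined_irradiance + bifaciality_factor * rear_inclined_irradiance
```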
broadband_multiple_surfaces ¶
Functions:
| Name | Description |
|---|---|
calculate_photovoltaic_power_output_series_from_multiple_surfaces | Estimate the total photovoltaic power for multiple solar surfaces. |
calculate_photovoltaic_power_output_series_from_multiple_surfaces ¶
calculate_photovoltaic_power_output_series_from_multiple_surfaces(
longitude: float,
latitude: float,
elevation: float,
timestamps: DatetimeIndex | None = DatetimeIndex(
[now(tz="UTC")]
),
timezone: ZoneInfo | None = ZoneInfo("UTC"),
global_horizontal_irradiance: Path | None = None,
direct_horizontal_irradiance: Path | None = None,
spectral_factor_series: SpectralFactorSeries = SpectralFactorSeries(
value=SPECTRAL_FACTOR_DEFAULT
),
temperature_series: ndarray = array(
TEMPERATURE_DEFAULT
),
wind_speed_series: ndarray = array(WIND_SPEED_DEFAULT),
surface_orientations: list[float] = [
SURFACE_ORIENTATION_DEFAULT
],
surface_tilts: list[float] = [SURFACE_TILT_DEFAULT],
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
albedo: float | None = ALBEDO_DEFAULT,
apply_reflectivity_factor: bool = ANGULAR_LOSS_FACTOR_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
sun_horizon_position: List[
SunHorizonPositionModel
] = SUN_HORIZON_POSITION_DEFAULT,
solar_incidence_model: SolarIncidenceModel = jenco,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = pvgis,
shading_states: List[ShadingState] = [all],
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
photovoltaic_module_type: PhotovoltaicModuleType = Monofacial,
bifaciality_factor: float = 0.3,
photovoltaic_module: PhotovoltaicModuleModel = PHOTOVOLTAIC_MODULE_DEFAULT,
peak_power: float = 1,
system_efficiency: (
float | None
) = SYSTEM_EFFICIENCY_DEFAULT,
power_model: PhotovoltaicModulePerformanceModel = None,
radiation_cutoff_threshold: float = RADIATION_CUTOFF_THRESHHOLD,
temperature_model: ModuleTemperatureAlgorithm = None,
efficiency: float | None = EFFICIENCY_FACTOR_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
angle_output_units: str = RADIANS,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
profile: bool = cPROFILE_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
)
Estimate the total photovoltaic power for multiple solar surfaces.
Estimate the total photovoltaic power for multiple solar surfaces (i.e., different pairs of surface orientation and tilt angles) over a time series, or the arbitrarily aggregated energy production of a PV system, based on the effective solar irradiance incident on each surface, the ambient temperature and, optionally, the wind speed.
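Summing over surfaces amounts to evaluating the single-surface power series for each (orientation, tilt) pair and adding the results element-wise. A minimal sketch with a hypothetical per-surface function (a toy stand-in, not the actual pvgisprototype API):

```python
import numpy as np

def power_series_for_surface(orientation: float, tilt: float) -> np.ndarray:
    """Hypothetical stand-in for the single-surface power calculation."""
    # Toy model: scale a base profile by a tilt-dependent factor;
    # orientation is ignored in this illustrative profile.
    base = np.array([0.0, 400.0, 700.0])
    return base * np.cos(np.radians(tilt))

surface_orientations = [180.0, 90.0]
surface_tilts = [30.0, 45.0]

# Element-wise sum of the per-surface power series.
total_power = sum(
    power_series_for_surface(orientation, tilt)
    for orientation, tilt in zip(surface_orientations, surface_tilts)
)
```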
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
longitude | float | The longitude of the location for which the energy production is calculated. | required |
latitude | float | The latitude of the location. | required |
elevation | float | Elevation of the location in meters. | required |
timestamps | DatetimeIndex | Specific timestamps for which to calculate the power output. Defaults to a single-element index holding the current UTC time. | DatetimeIndex([now(tz='UTC')]) |
timezone | ZoneInfo | None | Timezone of the location, by default UTC | ZoneInfo('UTC') |
global_horizontal_irradiance | Path | None | Path to global horizontal irradiance, by default None | None |
direct_horizontal_irradiance | Path | None | Path to direct horizontal irradiance, by default None | None |
spectral_factor_series | SpectralFactorSeries | Spectral factor values, by default SpectralFactorSeries(value=SPECTRAL_FACTOR_DEFAULT) | SpectralFactorSeries(value=SPECTRAL_FACTOR_DEFAULT) |
temperature_series | ndarray | Series of temperature values, by default np.array(TEMPERATURE_DEFAULT) | array(TEMPERATURE_DEFAULT) |
wind_speed_series | ndarray | Series of wind speed values, by default np.array(WIND_SPEED_DEFAULT) | array(WIND_SPEED_DEFAULT) |
dtype | str | Datatype, by default DATA_TYPE_DEFAULT | DATA_TYPE_DEFAULT |
array_backend | str | Array backend option, by default ARRAY_BACKEND_DEFAULT | ARRAY_BACKEND_DEFAULT |
multi_thread | bool | Calculations with multithread, by default True | MULTI_THREAD_FLAG_DEFAULT |
surface_orientations | list[float] | List of orientation values, by default [SURFACE_ORIENTATION_DEFAULT] | [SURFACE_ORIENTATION_DEFAULT] |
surface_tilts | list[float] | List of tilt values, by default [SURFACE_TILT_DEFAULT] | [SURFACE_TILT_DEFAULT] |
linke_turbidity_factor_series | LinkeTurbidityFactor | Linke turbidity factor values, by default [LinkeTurbidityFactor()] | LinkeTurbidityFactor() |
adjust_for_atmospheric_refraction | bool | Apply the atmospheric refraction correction, by default ATMOSPHERIC_REFRACTION_FLAG_DEFAULT | ATMOSPHERIC_REFRACTION_FLAG_DEFAULT |
albedo | float | None | Albedo, by default ALBEDO_DEFAULT | ALBEDO_DEFAULT |
apply_reflectivity_factor | bool | Apply angular loss factor, by default True | ANGULAR_LOSS_FACTOR_FLAG_DEFAULT |
solar_position_model | SolarPositionModel | Solar position model, by default SOLAR_POSITION_ALGORITHM_DEFAULT | SOLAR_POSITION_ALGORITHM_DEFAULT |
solar_incidence_model | SolarIncidenceModel | Solar incidence model, by default SolarIncidenceModel.jenco | jenco |
solar_time_model | SolarTimeModel | Solar time model, by default SOLAR_TIME_ALGORITHM_DEFAULT | SOLAR_TIME_ALGORITHM_DEFAULT |
solar_constant | float | Solar constant, by default SOLAR_CONSTANT | SOLAR_CONSTANT |
eccentricity_phase_offset | float | Perigee offset value, by default ECCENTRICITY_PHASE_OFFSET | ECCENTRICITY_PHASE_OFFSET |
eccentricity_amplitude | float | Eccentricity correction factor, by default ECCENTRICITY_CORRECTION_FACTOR | ECCENTRICITY_CORRECTION_FACTOR |
angle_output_units | str | Angle output units, by default RADIANS | RADIANS |
photovoltaic_module | PhotovoltaicModuleModel | Photovoltaic module, by default PHOTOVOLTAIC_MODULE_DEFAULT | PHOTOVOLTAIC_MODULE_DEFAULT |
system_efficiency | float | None | System efficiency, by default SYSTEM_EFFICIENCY_DEFAULT | SYSTEM_EFFICIENCY_DEFAULT |
power_model | PhotovoltaicModulePerformanceModel | Power model, by default None | None |
temperature_model | ModuleTemperatureAlgorithm | Temperature model, by default None | None |
efficiency | float | None | Module efficiency value, by default None | EFFICIENCY_FACTOR_DEFAULT |
verbose | int | Verbosity level, by default VERBOSE_LEVEL_DEFAULT | VERBOSE_LEVEL_DEFAULT |
log | int | Logging level, by default LOG_LEVEL_DEFAULT | LOG_LEVEL_DEFAULT |
fingerprint | bool | Include output fingerprint, by default False | FINGERPRINT_FLAG_DEFAULT |
profile | bool | Include profile, by default False | cPROFILE_FLAG_DEFAULT |
validate_output | bool | Perform validation on the output of each function | VALIDATE_OUTPUT_DEFAULT |
Returns:
| Type | Description |
|---|---|
PhotovoltaicPowerMultipleModules | Summary of array of effective irradiance values. |
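Each entry in `surface_orientations` is paired positionally with the entry at the same index in `surface_tilts`, so both lists should have the same length. The pairing mechanism used in the source below (via `zip()`) can be sketched in plain Python; the angle values and dictionary keys here are illustrative, not taken from the package:

```python
# Sketch: positional pairing of orientation and tilt angles, as done in
# the source below via zip(). Angles are illustrative values in degrees.
surface_orientations = [180.0, 90.0, 270.0]
surface_tilts = [30.0, 45.0, 30.0]

pairs = [
    {"surface_orientation": orientation, "surface_tilt": tilt}
    for orientation, tilt in zip(surface_orientations, surface_tilts)
]
print(pairs[1])  # {'surface_orientation': 90.0, 'surface_tilt': 45.0}
```

Note that `zip()` silently truncates to the shorter list, so a surface without a matching tilt (or vice versa) is dropped rather than reported.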
Source code in pvgisprototype/api/power/broadband_multiple_surfaces.py
@log_function_call
def calculate_photovoltaic_power_output_series_from_multiple_surfaces(
longitude: float,
latitude: float,
elevation: float,
timestamps: DatetimeIndex | None = DatetimeIndex([Timestamp.now(tz="UTC")]),
timezone: ZoneInfo | None = ZoneInfo("UTC"),
global_horizontal_irradiance: Path | None = None,
direct_horizontal_irradiance: Path | None = None,
spectral_factor_series: SpectralFactorSeries = SpectralFactorSeries(
value=SPECTRAL_FACTOR_DEFAULT
),
temperature_series: np.ndarray = np.array(TEMPERATURE_DEFAULT),
wind_speed_series: np.ndarray = np.array(WIND_SPEED_DEFAULT),
surface_orientations: list[float] = [SURFACE_ORIENTATION_DEFAULT],
surface_tilts: list[float] = [SURFACE_TILT_DEFAULT],
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
# unrefracted_solar_zenith: UnrefractedSolarZenith | None = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT,
albedo: float | None = ALBEDO_DEFAULT,
apply_reflectivity_factor: bool = ANGULAR_LOSS_FACTOR_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
sun_horizon_position: List[SunHorizonPositionModel] = SUN_HORIZON_POSITION_DEFAULT,
#
solar_incidence_model: SolarIncidenceModel = SolarIncidenceModel.jenco,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
#
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = ShadingModel.pvgis,
shading_states: List[ShadingState] = [ShadingState.all],
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
#
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
#
photovoltaic_module_type: PhotovoltaicModuleType = PhotovoltaicModuleType.Monofacial, # Leave Me Like This !
bifaciality_factor: float = 0.3, # 0.7, # Fixed !
photovoltaic_module: PhotovoltaicModuleModel = PHOTOVOLTAIC_MODULE_DEFAULT,
peak_power: float = 1,
#
system_efficiency: float | None = SYSTEM_EFFICIENCY_DEFAULT,
power_model: PhotovoltaicModulePerformanceModel = None,
radiation_cutoff_threshold: float = RADIATION_CUTOFF_THRESHHOLD,
temperature_model: ModuleTemperatureAlgorithm = None,
efficiency: float | None = EFFICIENCY_FACTOR_DEFAULT,
#
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
#
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
angle_output_units: str = RADIANS,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
profile: bool = cPROFILE_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
):
"""Estimate the total photovoltaic power for multiple solar surfaces.
Estimate the total photovoltaic power for multiple solar surfaces, i.e.
different pairs of surface orientation and tilt angles) over a time series
or an arbitrarily aggregated energy production of a PV system based on the
effective solar irradiance incident on a solar surface, the ambient
temperature and optionally wind speed.
Parameters
----------
longitude : float
The longitude of the location for which the energy production is calculated.
latitude : float
The latitude of the location.
elevation : float
Elevation of the location in meters.
timestamps : DatetimeIndex, optional
Specific timestamps for which to calculate the irradiance, by default the current UTC time.
timezone : ZoneInfo, optional
Timezone of the location, by default UTC.
global_horizontal_irradiance : Path | None, optional
Path to global horizontal irradiance, by default None
direct_horizontal_irradiance : Path | None, optional
Path to direct horizontal irradiance, by default None
spectral_factor_series : SpectralFactorSeries, optional
Spectral factor values, by default SpectralFactorSeries(value=SPECTRAL_FACTOR_DEFAULT)
temperature_series : np.ndarray, optional
Series of temperature values, by default np.array(TEMPERATURE_DEFAULT)
wind_speed_series : np.ndarray, optional
Series of wind speed values, by default np.array(WIND_SPEED_DEFAULT)
dtype : str, optional
Datatype, by default DATA_TYPE_DEFAULT
array_backend : str, optional
Array backend option, by default ARRAY_BACKEND_DEFAULT
multi_thread : bool, optional
Calculations with multithread, by default True
surface_orientations : list[float], optional
List of orientation values, by default [SURFACE_ORIENTATION_DEFAULT]
surface_tilts : list[float], optional
List of tilt values, by default [SURFACE_TILT_DEFAULT]
linke_turbidity_factor_series : LinkeTurbidityFactor, optional
Linke turbidity factor values, by default [LinkeTurbidityFactor()]
adjust_for_atmospheric_refraction : bool, optional
Apply the atmospheric refraction correction, by default ATMOSPHERIC_REFRACTION_FLAG_DEFAULT
albedo : float | None, optional
Albedo, by default ALBEDO_DEFAULT
apply_reflectivity_factor : bool, optional
Apply angular loss factor, by default True
solar_position_model : SolarPositionModel, optional
Solar position model, by default SOLAR_POSITION_ALGORITHM_DEFAULT
solar_incidence_model : SolarIncidenceModel, optional
Solar incidence model, by default SolarIncidenceModel.jenco
solar_time_model : SolarTimeModel, optional
Solar time model, by default SOLAR_TIME_ALGORITHM_DEFAULT
solar_constant : float, optional
Solar constant, by default SOLAR_CONSTANT
eccentricity_phase_offset : float, optional
Perigee offset value, by default ECCENTRICITY_PHASE_OFFSET
eccentricity_amplitude : float, optional
Eccentricity correction factor, by default ECCENTRICITY_CORRECTION_FACTOR
angle_output_units : str, optional
Angle output units, by default RADIANS
photovoltaic_module : PhotovoltaicModuleModel, optional
Photovoltaic module, by default PHOTOVOLTAIC_MODULE_DEFAULT
system_efficiency : float | None, optional
System efficiency, by default SYSTEM_EFFICIENCY_DEFAULT
power_model : PhotovoltaicModulePerformanceModel, optional
Power model, by default None
temperature_model : ModuleTemperatureAlgorithm, optional
Temperature model, by default None
efficiency : float | None, optional
Module efficiency value, by default None
verbose : int, optional
Verbosity level, by default VERBOSE_LEVEL_DEFAULT
log : int, optional
Logging level, by default LOG_LEVEL_DEFAULT
fingerprint : bool, optional
Include output fingerprint, by default False
profile : bool, optional
Include profile, by default False
validate_output: bool, optional
Perform validation on the output of each function
Returns
-------
PhotovoltaicPowerMultipleModules
Summary of array of effective irradiance values.
"""
profiler = setup_profiler(enable_profiling=profile)
pairs_of_surface_orientation_and_tilt_angles = (
{SolarSurfacePositionParameter.surface_orientation.name: orientation,
SolarSurfacePositionParameter.surface_tilt.name: tilt}
for orientation, tilt in zip(surface_orientations, surface_tilts)
)
sun_horizon_positions = select_models(SunHorizonPositionModel, sun_horizon_position)
common_parameters = {
"longitude": longitude,
"latitude": latitude,
"elevation": elevation,
#
"timestamps": timestamps,
"timezone": timezone,
#
"global_horizontal_irradiance": global_horizontal_irradiance,
"direct_horizontal_irradiance": direct_horizontal_irradiance,
"spectral_factor_series": spectral_factor_series,
"temperature_series": temperature_series,
"wind_speed_series": wind_speed_series,
#
"linke_turbidity_factor_series": linke_turbidity_factor_series,
"adjust_for_atmospheric_refraction": adjust_for_atmospheric_refraction,
# "unrefracted_solar_zenith": unrefracted_solar_zenith,
"albedo": albedo,
#
"apply_reflectivity_factor": apply_reflectivity_factor,
"solar_position_model": solar_position_model,
"sun_horizon_position": sun_horizon_position,
#
"solar_incidence_model": solar_incidence_model,
"zero_negative_solar_incidence_angle": zero_negative_solar_incidence_angle,
#
"horizon_profile": horizon_profile,
"shading_model": shading_model,
"shading_states": shading_states,
"solar_time_model": solar_time_model,
#
"solar_constant": solar_constant,
"eccentricity_phase_offset": eccentricity_phase_offset,
"eccentricity_amplitude": eccentricity_amplitude,
#
"photovoltaic_module_type": photovoltaic_module_type,
"bifaciality_factor": bifaciality_factor,
"photovoltaic_module": photovoltaic_module,
"peak_power": peak_power,
#
"system_efficiency": system_efficiency,
"power_model": power_model,
"radiation_cutoff_threshold": radiation_cutoff_threshold,
"temperature_model": temperature_model,
"efficiency": efficiency,
#
"dtype": dtype,
"array_backend": array_backend,
#
# "angle_output_units": angle_output_units,
"validate_output": validate_output,
"verbose": VERBOSE_LEVEL_MULTI_MODULE_DEFAULT,
"log": log,
"fingerprint": fingerprint,
"profile": profile,
}
if multi_thread:
from functools import partial
from multiprocessing.pool import ThreadPool as Pool
pool = Pool()
partial_calculate_photovoltaic_power_output_series = partial(
calculate_photovoltaic_power_output_series, **common_parameters
)
individual_photovoltaic_power_outputs = pool.map(
lambda args: partial_calculate_photovoltaic_power_output_series(**args),
pairs_of_surface_orientation_and_tilt_angles,
)
pool.close()
else:
individual_photovoltaic_power_outputs = [
calculate_photovoltaic_power_output_series(
**common_parameters,
**surface_position_angles,
)
for surface_position_angles in pairs_of_surface_orientation_and_tilt_angles
]
array_parameters = {
"shape": timestamps.shape,
"dtype": dtype,
"init_method": "zeros",
"backend": array_backend,
} # Borrow shape from timestamps
# Irradiance after reflectivity ?
total_global_inclined_irradiance = create_array(**array_parameters)
total_direct_inclined_irradiance = create_array(**array_parameters)
total_diffuse_inclined_irradiance = create_array(**array_parameters)
# In-plane (or inclined) irradiance **before reflectivity**
total_direct_inclined_irradiance_before_reflectivity = create_array(
**array_parameters
)
total_diffuse_inclined_irradiance_before_reflectivity = create_array(
**array_parameters
)
total_ground_reflected_inclined_irradiance_before_reflectivity = create_array(
**array_parameters
)
# sum of the above three =
total_global_inclined_irradiance_before_reflectivity = create_array(
**array_parameters
)
# reflectivity effect factor as a function of the incidence angle
total_direct_inclined_reflectivity_factor = create_array(**array_parameters)
total_diffuse_inclined_reflectivity_factor = create_array(**array_parameters)
total_ground_reflected_inclined_reflectivity_factor = create_array(
**array_parameters
)
# ... --- Does this make sense at this point ?
total_global_inclined_reflected = create_array(**array_parameters)
# after the reflectivity effect
total_ground_reflected_inclined_irradiance = create_array(**array_parameters)
total_direct_horizontal_irradiance = DirectHorizontalIrradiance(
value= create_array(**array_parameters),
# out_of_range=out_of_range,
# out_of_range_index=out_of_range_index,
elevation=elevation,
# solar_altitude=solar_altitude_series,
# refracted_solar_altitude=refracted_solar_altitude_series.value,
# optical_air_mass=optical_air_mass_series,
# direct_normal_irradiance=direct_normal_irradiance_series,
# surface_in_shade=surface_in_shade_series,
# solar_radiation_model=HOFIERKA_2002,
# data_source=HOFIERKA_2002,
)
total_diffuse_horizontal_irradiance = DiffuseSkyReflectedHorizontalIrradiance(
value= create_array(**array_parameters),
# out_of_range=out_of_range,
# out_of_range_index=out_of_range_index,
# extraterrestrial_normal_irradiance=extraterrestrial_normal_irradiance_series,
linke_turbidity_factor=linke_turbidity_factor_series,
# solar_altitude=solar_altitude_series,
# solar_positioning_algorithm=solar_altitude_series.solar_positioning_algorithm,
# adjust_for_atmospheric_refraction=solar_altitude_series.adjusted_for_atmospheric_refraction,
)
global_irradiance_series = create_array(**array_parameters)
#
photovoltaic_power_output_without_system_loss_series = create_array(
**array_parameters
)
photovoltaic_power_output_series = create_array(**array_parameters)
# same for all years, applies to global or any component
total_spectral_effect = create_array(**array_parameters)
total_effective_direct_irradiance = create_array(**array_parameters)
total_effective_diffuse_irradiance = create_array(**array_parameters)
total_effective_reflected_inclined_irradiance = create_array(**array_parameters)
# sum of above three
total_effective_global_irradiance = create_array(**array_parameters)
# sun_horizon_positions = []
for photovoltaic_power_output in individual_photovoltaic_power_outputs:
photovoltaic_power_output_series += photovoltaic_power_output.value
global_irradiance_series += photovoltaic_power_output.global_inclined_irradiance
photovoltaic_power_output_without_system_loss_series += (
photovoltaic_power_output.photovoltaic_power_without_system_loss
)
total_effective_global_irradiance += (
photovoltaic_power_output.effective_global_irradiance
)
total_effective_direct_irradiance += (
photovoltaic_power_output.effective_direct_irradiance
)
total_effective_diffuse_irradiance += (
photovoltaic_power_output.effective_diffuse_irradiance
)
total_effective_reflected_inclined_irradiance += (
photovoltaic_power_output.effective_ground_reflected_irradiance
)
total_spectral_effect += photovoltaic_power_output.spectral_effect
if apply_reflectivity_factor:
# the amount after the reflectivity effect !
total_global_inclined_reflected += (
photovoltaic_power_output.global_inclined_reflected
)
total_direct_inclined_reflectivity_factor += (
photovoltaic_power_output.direct_inclined_reflectivity_factor
)
total_diffuse_inclined_reflectivity_factor += (
photovoltaic_power_output.diffuse_inclined_reflectivity_factor
)
total_ground_reflected_inclined_reflectivity_factor += (
photovoltaic_power_output.ground_reflected_inclined_reflectivity_factor
)
total_global_inclined_irradiance += (
photovoltaic_power_output.global_inclined_irradiance
)
total_direct_inclined_irradiance += (
photovoltaic_power_output.direct_inclined_irradiance
)
total_diffuse_inclined_irradiance += (
photovoltaic_power_output.diffuse_inclined_irradiance
)
total_ground_reflected_inclined_irradiance += (
photovoltaic_power_output.ground_reflected_inclined_irradiance
)
# Irradiance before reflectivity effect
total_global_inclined_irradiance_before_reflectivity += (
photovoltaic_power_output.ground_reflected_inclined_before_reflectivity
if apply_reflectivity_factor
else photovoltaic_power_output.global_inclined_irradiance
)
total_direct_inclined_irradiance_before_reflectivity += (
photovoltaic_power_output.direct_inclined_before_reflectivity
if apply_reflectivity_factor
else photovoltaic_power_output.direct_inclined_irradiance
)
total_diffuse_inclined_irradiance_before_reflectivity += (
photovoltaic_power_output.diffuse_inclined_before_reflectivity
if apply_reflectivity_factor
else photovoltaic_power_output.diffuse_inclined_irradiance
)
total_ground_reflected_inclined_irradiance_before_reflectivity += (
photovoltaic_power_output.ground_reflected_inclined_before_reflectivity
if apply_reflectivity_factor
else photovoltaic_power_output.ground_reflected_inclined_irradiance
)
total_direct_horizontal_irradiance.value += (
photovoltaic_power_output.direct_horizontal_irradiance.value
)
total_diffuse_horizontal_irradiance.value += (
photovoltaic_power_output.diffuse_horizontal_irradiance.value
)
# # Plus, some metadata
# sun_horizon_positions.append(photovoltaic_power_output.sun_horizon_positions)
total_spectral_effect_percentage = (
(total_spectral_effect / global_irradiance_series * 100)
if global_irradiance_series is not None
else 0
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
if profile:
finalise_profiling(
disable_profiling=not profile,  # logical negation; bitwise ~ on a bool yields -1/-2, which is always truthy
profiler=profiler,
verbose=verbose,
)
log_data_fingerprint(
data=photovoltaic_power_output_series,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
photovoltaic_power = PhotovoltaicPowerMultipleModules(
value=photovoltaic_power_output_series,
modules=individual_photovoltaic_power_outputs,
individual_series=individual_photovoltaic_power_outputs,
# out_of_range=out_of_range,
# out_of_range_index=out_of_range_index,
# unit=POWER_UNIT,
technology=photovoltaic_module.value,
power_model=power_model.value,
system_efficiency=system_efficiency,
efficiency_factor=individual_photovoltaic_power_outputs[0].efficiency_factor,
temperature=temperature_series,
wind_speed=wind_speed_series,
#
## Effective Irradiance Components
effective_global_irradiance=total_effective_global_irradiance,
effective_direct_irradiance=total_effective_direct_irradiance,
effective_diffuse_irradiance=total_effective_diffuse_irradiance,
effective_ground_reflected_irradiance=total_effective_reflected_inclined_irradiance,
spectral_effect=total_spectral_effect,
spectral_effect_percentage=total_spectral_effect_percentage,
spectral_factor=spectral_factor_series,
peak_power=peak_power,
#
## Inclined Irradiance Components
global_inclined_irradiance=total_global_inclined_irradiance,
direct_inclined_irradiance=total_direct_inclined_irradiance,
diffuse_inclined_irradiance=total_diffuse_inclined_irradiance,
ground_reflected_inclined_irradiance=total_ground_reflected_inclined_irradiance,
#
## Horizontal Irradiance Components
irradiance=global_irradiance_series,
global_horizontal_irradiance=global_horizontal_irradiance,
direct_horizontal_irradiance=total_direct_horizontal_irradiance,
diffuse_horizontal_irradiance=total_diffuse_horizontal_irradiance,
#
## Components of the Extraterrestrial irradiance
extraterrestrial_horizontal_irradiance=individual_photovoltaic_power_outputs[0].extraterrestrial_horizontal_irradiance,
extraterrestrial_normal_irradiance=individual_photovoltaic_power_outputs[0].extraterrestrial_normal_irradiance,
# linke_turbidity_factor=linke_turbidity_factor_series,
#
## Location and Position
# location=,
elevation=elevation,
surface_orientations=surface_orientations,
surface_tilts=surface_tilts,
surface_position_angle_pairs=list(zip(surface_orientations, surface_tilts)),
sun_horizon_positions=sun_horizon_positions,
#
## Solar Position parameters
horizon_height=individual_photovoltaic_power_outputs[0].surface_in_shade.horizon_height,
surface_in_shade=individual_photovoltaic_power_outputs[0].surface_in_shade,
visible=individual_photovoltaic_power_outputs[0].surface_in_shade.visible,
solar_incidence=individual_photovoltaic_power_outputs[0].solar_incidence, # This is not correct !
shading_state=individual_photovoltaic_power_outputs[0].shading_state,
sun_horizon_position=individual_photovoltaic_power_outputs[0].sun_horizon_position, # positions != sun_horizon_positions
solar_altitude=individual_photovoltaic_power_outputs[0].solar_altitude,
# refracted_solar_altitude=individual_photovoltaic_power_outputs[0].refracted_solar_altitude,
solar_azimuth=individual_photovoltaic_power_outputs[0].solar_azimuth,
solar_azimuth_origin=individual_photovoltaic_power_outputs[0].solar_azimuth.origin,
# azimuth_difference=azimuth_difference_series,
#
## Positioning, Timing and Atmospheric algorithms
angle_output_units=individual_photovoltaic_power_outputs[0].solar_incidence.unit, # Maybe get from surface_[prientation|tilt] ?
# solar_positioning_algorithm=individual_photovoltaic_power_outputs[0].solar_positioning_algorithm,
solar_positioning_algorithm="",
# solar_timing_algorithm=individual_photovoltaic_power_outputs[0].solar_timing_algorithm,
solar_timing_algorithm="",
adjusted_for_atmospheric_refraction=individual_photovoltaic_power_outputs[0].adjusted_for_atmospheric_refraction,
solar_incidence_model=individual_photovoltaic_power_outputs[0].solar_incidence_model,
solar_incidence_definition=individual_photovoltaic_power_outputs[0].solar_incidence.definition,
# SOLAR_CONSTANT_COLUMN_NAME: solar_constant,
# ECCENTRICITY_PHASE_OFFSET_COLUMN_NAME: eccentricity_phase_offset,
# ECCENTRICITY_CORRECTION_FACTOR_COLUMN_NAME: eccentricity_amplitude,
shading_algorithm=individual_photovoltaic_power_outputs[0].shading_algorithm,
shading_states=shading_states,
)
photovoltaic_power.build_output(
verbose=verbose, fingerprint=fingerprint
)
return photovoltaic_power
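When `multi_thread` is enabled, the function above binds the shared parameters once with `functools.partial` and fans each orientation/tilt pair out to a `ThreadPool`, then sums the per-surface results. A minimal, self-contained sketch of that fan-out pattern, with a toy worker standing in for `calculate_photovoltaic_power_output_series` (the worker and its return values are illustrative, not part of the package):

```python
from functools import partial
from multiprocessing.pool import ThreadPool as Pool


def toy_power_output(surface_orientation, surface_tilt, peak_power=1.0):
    """Toy stand-in for the real per-surface power calculation."""
    return peak_power * (surface_orientation + surface_tilt)


pairs = [
    {"surface_orientation": orientation, "surface_tilt": tilt}
    for orientation, tilt in zip([180.0, 90.0], [30.0, 45.0])
]
# Bind the parameters common to every surface once ...
worker = partial(toy_power_output, peak_power=2.0)
with Pool() as pool:
    # ... then map only the per-surface angles across the pool.
    outputs = pool.map(lambda kwargs: worker(**kwargs), pairs)
# Per-surface results are summed, as in the function above.
total = sum(outputs)
```

A thread pool (rather than a process pool) keeps the per-surface result objects in shared memory and avoids pickling them, at the cost of contending for the GIL during pure-Python sections of the work.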
broadband_rear_side ¶
Functions:
| Name | Description |
|---|---|
calculate_rear_side_photovoltaic_power_output_series | Estimate the photovoltaic power over a time series or an arbitrarily |
calculate_rear_side_photovoltaic_power_output_series ¶
calculate_rear_side_photovoltaic_power_output_series(
longitude: float,
latitude: float,
elevation: float,
rear_side_surface_orientation: (
SurfaceOrientation | None
) = SURFACE_ORIENTATION_DEFAULT,
rear_side_surface_tilt: (
SurfaceTilt | None
) = SURFACE_TILT_DEFAULT,
timestamps: DatetimeIndex | None = DatetimeIndex(
[now(tz="UTC")]
),
timezone: ZoneInfo = ZoneInfo("UTC"),
global_horizontal_irradiance: (
ndarray | Path | None
) = None,
direct_horizontal_irradiance: (
ndarray | Path | None
) = None,
spectral_factor_series: SpectralFactorSeries = SpectralFactorSeries(
value=SPECTRAL_FACTOR_DEFAULT
),
temperature_series: ndarray = array(
TEMPERATURE_DEFAULT
),
wind_speed_series: ndarray = array(WIND_SPEED_DEFAULT),
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
albedo: float | None = ALBEDO_DEFAULT,
apply_reflectivity_factor: bool = ANGULAR_LOSS_FACTOR_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
sun_horizon_position: List[
SunHorizonPositionModel
] = SUN_HORIZON_POSITION_DEFAULT,
solar_incidence_model: SolarIncidenceModel = iqbal,
zero_negative_solar_incidence_angle: bool = DO_NOT_ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = pvgis,
shading_states: List[ShadingState] = [all],
angle_output_units: str = RADIANS,
photovoltaic_module: PhotovoltaicModuleModel = CSI_FREE_STANDING,
peak_power: float = PEAK_POWER_DEFAULT,
system_efficiency: (
float | None
) = SYSTEM_EFFICIENCY_DEFAULT,
power_model: PhotovoltaicModulePerformanceModel = king,
radiation_cutoff_threshold: float = RADIATION_CUTOFF_THRESHHOLD,
temperature_model: ModuleTemperatureAlgorithm = faiman,
rear_side_efficiency: (
float | None
) = REAR_SIDE_EFFICIENCY_FACTOR_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
profile: bool = cPROFILE_FLAG_DEFAULT,
)
Estimate the photovoltaic power over a time series, or the arbitrarily aggregated energy production of a PV system, based on the effective solar irradiance incident on a solar surface, the ambient temperature and, optionally, the wind speed.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
longitude | float | The longitude of the location for which the energy production is calculated. | required |
latitude | float | The latitude of the location. | required |
elevation | float | Elevation of the location in meters. | required |
timestamps | DatetimeIndex | Specific timestamps for which to calculate the irradiance, by default the current UTC time. | DatetimeIndex([now(tz='UTC')]) |
timezone | ZoneInfo | Timezone of the location, by default UTC. | ZoneInfo('UTC') |
global_horizontal_irradiance | ndarray | Path | None | Path to data file for global horizontal irradiance. Default is None. | None |
direct_horizontal_irradiance | ndarray | Path | None | Path to data file for direct horizontal irradiance. Default is None. | None |
temperature_series | ndarray | Series of temperature values. Default is TEMPERATURE_DEFAULT. | array(TEMPERATURE_DEFAULT) |
wind_speed_series | ndarray | Series of wind speed values. Default is WIND_SPEED_DEFAULT. | array(WIND_SPEED_DEFAULT) |
Returns:
| Name | Type | Description |
|---|---|---|
photovoltaic_power_output_series | ndarray | Array of effective irradiance values. |
results | dict | Dictionary containing detailed results of the calculation. |
title | str | Title of the output data. |
Examples:
>>> calculate_rear_side_photovoltaic_power_output_series(10.0, 20.0, 100.0)
# This will return the effective irradiance series, results, and title for the specified parameters.
Notes
This function is part of the Typer-based CLI for the new PVGIS implementation in Python. It provides an interface for estimating the energy production of a photovoltaic system, taking into account various environmental and system parameters.
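The default `temperature_model` here is `faiman`. In the literature (Faiman, 2008), this model estimates the steady-state module temperature from ambient temperature, in-plane irradiance and wind speed as T_mod = T_amb + G / (u0 + u1 · v). A sketch with commonly cited coefficients (u0 = 25.0, u1 = 6.84, the values used e.g. by pvlib; the coefficients and exact implementation in this package may differ):

```python
def faiman_module_temperature(t_ambient, irradiance, wind_speed,
                              u0=25.0, u1=6.84):
    """Faiman (2008) steady-state module temperature, in degrees Celsius.

    t_ambient : ambient air temperature [deg C]
    irradiance : in-plane irradiance [W/m^2]
    wind_speed : wind speed [m/s]
    u0, u1 : heat-loss coefficients [W/(m^2 K)] and [W/(m^2 K) per m/s]
    """
    return t_ambient + irradiance / (u0 + u1 * wind_speed)


# At 25 deg C, 1000 W/m^2 and no wind: 25 + 1000 / 25 = 65 deg C
print(faiman_module_temperature(25.0, 1000.0, 0.0))
```

Higher wind speeds increase the denominator, cooling the module toward the ambient temperature, which in turn raises the module efficiency used by the power model.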
Source code in pvgisprototype/api/power/broadband_rear_side.py
@log_function_call
def calculate_rear_side_photovoltaic_power_output_series(
longitude: float,
latitude: float,
elevation: float,
rear_side_surface_orientation: (
SurfaceOrientation | None
) = SURFACE_ORIENTATION_DEFAULT,
rear_side_surface_tilt: SurfaceTilt | None = SURFACE_TILT_DEFAULT,
timestamps: DatetimeIndex | None = DatetimeIndex([Timestamp.now(tz="UTC")]),
timezone: ZoneInfo = ZoneInfo("UTC"),
global_horizontal_irradiance: ndarray | Path | None = None,
direct_horizontal_irradiance: ndarray | Path | None = None,
spectral_factor_series: SpectralFactorSeries = SpectralFactorSeries(
value=SPECTRAL_FACTOR_DEFAULT
),
temperature_series: numpy.ndarray = numpy.array(TEMPERATURE_DEFAULT),
wind_speed_series: numpy.ndarray = numpy.array(WIND_SPEED_DEFAULT),
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
# unrefracted_solar_zenith: UnrefractedSolarZenith | None = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT,
albedo: float | None = ALBEDO_DEFAULT,
apply_reflectivity_factor: bool = ANGULAR_LOSS_FACTOR_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
sun_horizon_position: List[SunHorizonPositionModel] = SUN_HORIZON_POSITION_DEFAULT,
solar_incidence_model: SolarIncidenceModel = SolarIncidenceModel.iqbal,
zero_negative_solar_incidence_angle: bool = DO_NOT_ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT, # On purpose so !
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = ShadingModel.pvgis,
shading_states: List[ShadingState] = [ShadingState.all], # make it a set ?
angle_output_units: str = RADIANS,
photovoltaic_module: PhotovoltaicModuleModel = PhotovoltaicModuleModel.CSI_FREE_STANDING,
# photovoltaic_module_type: PhotovoltaicModuleType = PhotovoltaicModuleType.Bifacial, # Leave Me Like This !
peak_power: float = PEAK_POWER_DEFAULT,
system_efficiency: float | None = SYSTEM_EFFICIENCY_DEFAULT,
power_model: PhotovoltaicModulePerformanceModel = PhotovoltaicModulePerformanceModel.king,
radiation_cutoff_threshold: float = RADIATION_CUTOFF_THRESHHOLD,
temperature_model: ModuleTemperatureAlgorithm = ModuleTemperatureAlgorithm.faiman,
rear_side_efficiency: float | None = REAR_SIDE_EFFICIENCY_FACTOR_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
# multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
validate_output: bool = VALIDATE_OUTPUT_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
profile: bool = cPROFILE_FLAG_DEFAULT,
):
"""
Estimate the photovoltaic power over a time series, or the arbitrarily
aggregated energy production of a PV system, based on the effective solar
irradiance incident on a solar surface, the ambient temperature and,
optionally, the wind speed.
Parameters
----------
longitude : float
The longitude of the location for which the energy production is calculated.
latitude : float
The latitude of the location.
elevation : float
Elevation of the location in meters.
timestamps : DatetimeIndex, optional
Specific timestamps for which to calculate the irradiance, by default the current UTC time.
timezone : ZoneInfo, optional
Timezone of the location, by default UTC.
global_horizontal_irradiance : ndarray | Path | None, optional
Path to data file for global horizontal irradiance. Default is None.
direct_horizontal_irradiance : ndarray | Path | None, optional
Path to data file for direct horizontal irradiance. Default is None.
temperature_series : np.ndarray, optional
Series of temperature values. Default is TEMPERATURE_DEFAULT.
wind_speed_series : np.ndarray, optional
Series of wind speed values. Default is WIND_SPEED_DEFAULT.
# ... other parameters ...
Returns
-------
photovoltaic_power_output_series : ndarray
Array of photovoltaic power output values.
results : dict
Dictionary containing detailed results of the calculation.
title : str
Title of the output data.
Examples
--------
>>> calculate_photovoltaic_power_output_series(10.0, 20.0, 100.0)
# This will return the effective irradiance series, results, and title for the specified parameters.
Notes
-----
This function is part of the Typer-based CLI for the new PVGIS
implementation in Python. It provides an interface for estimating the
energy production of a photovoltaic system, taking into account various
environmental and system parameters.
"""
if profile:
import cProfile
pr = cProfile.Profile()
pr.enable()
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug(
"i Modelling the solar altitude for the given timestamps ..",
alt="i [bold]Modelling[/bold] the [magenta]solar altitude[/magenta] for the given timestamps ..",
)
solar_altitude_series = model_solar_altitude_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_position_model=solar_position_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
# solar_time_model=solar_time_model,
# eccentricity_phase_offset=eccentricity_phase_offset,
# eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug(
"i Modelling the solar azimuth for the given timestamps ..",
alt="i [bold]Modelling[/bold] the [magenta]solar azimuth[/magenta] for the given timestamps ..",
)
solar_azimuth_series = model_solar_azimuth_series(
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_position_model=solar_position_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
# solar_time_model=solar_time_model,
# eccentricity_phase_offset=eccentricity_phase_offset,
# eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=0,
log=log,
validate_output=validate_output,
)
surface_in_shade_series = model_surface_in_shade_series(
horizon_profile=horizon_profile,
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
timezone=timezone,
solar_time_model=solar_time_model,
solar_position_model=solar_position_model,
shading_model=shading_model,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
validate_output=validate_output,
)
# To avoid unbound-variable errors, pre-define the `_series` objects
array_parameters = {
"shape": timestamps.shape,
"dtype": dtype,
"init_method": "zeros",
"backend": array_backend,
} # Borrow shape from timestamps
# direct
rear_side_direct_horizontal_irradiance_series = create_array(**array_parameters)
rear_side_direct_inclined_irradiance_series = create_array(**array_parameters)
# diffuse (== sky-reflected)
rear_side_diffuse_horizontal_irradiance_series = create_array(**array_parameters)
rear_side_diffuse_inclined_irradiance_series = create_array(**array_parameters)
# ground-reflected
# there is no ground-reflected horizontal component as such !
rear_side_ground_reflected_inclined_irradiance_series = create_array(
**array_parameters
)
# before reflectivity
rear_side_direct_inclined_irradiance_before_reflectivity_series = create_array(
**array_parameters
)
rear_side_diffuse_inclined_irradiance_before_reflectivity_series = create_array(
**array_parameters
)
rear_side_ground_reflected_inclined_irradiance_before_reflectivity_series = (
create_array(**array_parameters)
)
# reflectivity effect factor/s
rear_side_direct_inclined_reflectivity_factor_series = create_array(
**array_parameters
)
rear_side_diffuse_inclined_reflectivity_factor_series = create_array(
**array_parameters
)
rear_side_ground_reflected_inclined_reflectivity_factor_series = create_array(
**array_parameters
)
# after reflectivity effect
rear_side_direct_inclined_reflected_series = create_array(**array_parameters)
rear_side_diffuse_inclined_reflected_series = create_array(**array_parameters)
rear_side_ground_reflected_inclined_reflected_series = create_array(
**array_parameters
)
# Select which solar positions related to the horizon to process
sun_horizon_positions = select_models(
SunHorizonPositionModel, sun_horizon_position
) # Using a callback fails!
# and keep track of the position of the sun relative to the horizon
sun_horizon_position_series = create_array(
timestamps.shape, dtype="object", init_method="empty", backend=array_backend
)
# For sun below the horizon
if SunHorizonPositionModel.below in sun_horizon_positions:
mask_below_horizon = solar_altitude_series.value < 0
sun_horizon_position_series[mask_below_horizon] = [
SunHorizonPositionModel.below.value
]
if numpy.any(mask_below_horizon):
logger.debug(
f"Positions of the sun below horizon :\n{sun_horizon_position_series}",
alt=f"Positions of the sun [bold gray50]below horizon[/bold gray50] :\n{sun_horizon_position_series}",
)
rear_side_direct_inclined_irradiance_series[mask_below_horizon] = 0
rear_side_diffuse_inclined_irradiance_series[mask_below_horizon] = 0
rear_side_ground_reflected_inclined_irradiance_series[
mask_below_horizon
] = 0
# For very low sun angles
if SunHorizonPositionModel.low_angle in sun_horizon_positions:
# numpy.logical_and() accepts only two array operands (a third is taken
# as `out`); use .reduce to combine all three masks element-wise
mask_low_angle = numpy.logical_and.reduce(
[
solar_altitude_series.value >= 0,
solar_altitude_series.value
< solar_altitude_series.low_angle_threshold_radians,
sun_horizon_position_series == None, # operate only on unset elements
]
)
sun_horizon_position_series[mask_low_angle] = [
SunHorizonPositionModel.low_angle.value
]
rear_side_direct_inclined_irradiance_series[mask_low_angle] = (
0 # Direct radiation is negligible
)
if SunHorizonPositionModel.above in sun_horizon_positions:
mask_above_horizon = numpy.logical_and(
solar_altitude_series.value > 0,
sun_horizon_position_series == None, # operate only on unset elements
)
sun_horizon_position_series[mask_above_horizon] = [
SunHorizonPositionModel.above.value
]
# For sun above horizon and not in shade
mask_not_in_shade = ~surface_in_shade_series.value
# numpy.logical_and() accepts only two array operands; .reduce combines all three
mask_above_horizon_not_in_shade = numpy.logical_and.reduce(
[
mask_above_horizon,
mask_not_in_shade,
sun_horizon_position_series == None, # operate only on unset elements
]
)
if numpy.any(mask_above_horizon_not_in_shade):
# sun_horizon_position_series[mask_above_horizon_not_in_shade] = [SunHorizonPositionModel.above.name]
logger.debug(
f"Including positions of the sun above horizon and not in shade :\n{sun_horizon_position_series}",
alt=f"Including positions of the sun [bold yellow]above horizon[/bold yellow] and [bold red]not in shade[/bold red] :\n{sun_horizon_position_series}",
)
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug(
"i [bold]Calculating[/bold] the [magenta]direct inclined irradiance[/magenta] for moments not in shade .."
)
rear_side_calculated_direct_inclined_irradiance_series = (
calculate_direct_inclined_irradiance(
longitude=longitude,
latitude=latitude,
elevation=elevation,
timestamps=timestamps,
timezone=timezone,
direct_horizontal_irradiance=direct_horizontal_irradiance,
surface_tilt=rear_side_surface_tilt,
surface_orientation=rear_side_surface_orientation,
linke_turbidity_factor_series=linke_turbidity_factor_series,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
apply_reflectivity_factor=apply_reflectivity_factor,
solar_position_model=solar_position_model,
solar_incidence_model=solar_incidence_model,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
horizon_profile=horizon_profile,
shading_model=shading_model,
solar_time_model=solar_time_model,
solar_constant=solar_constant,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
angle_output_units=angle_output_units,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
)
rear_side_direct_horizontal_irradiance_series = (
rear_side_calculated_direct_inclined_irradiance_series.components.get(
DIRECT_HORIZONTAL_IRRADIANCE_COLUMN_NAME,
numpy.array([]),
)
)
rear_side_direct_inclined_irradiance_series[
mask_above_horizon_not_in_shade
] = rear_side_calculated_direct_inclined_irradiance_series.value[
mask_above_horizon_not_in_shade
] # .value is the direct inclined irradiance series
rear_side_direct_inclined_irradiance_before_reflectivity_series = (
rear_side_calculated_direct_inclined_irradiance_series.components.get(
DIRECT_INCLINED_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME,
numpy.array([]),
)
)
rear_side_direct_inclined_reflectivity_factor_series = (
rear_side_calculated_direct_inclined_irradiance_series.components.get(
REFLECTIVITY_FACTOR_COLUMN_NAME, numpy.array([])
)
)
rear_side_direct_inclined_reflected_series = (
rear_side_calculated_direct_inclined_irradiance_series.components.get(
REFLECTIVITY_COLUMN_NAME, numpy.array([])
)
)
# Calculate diffuse and reflected irradiance for sun above horizon
if not numpy.any(mask_above_horizon):
logger.debug(
"i [yellow bold]Apparently there is no moment of the sun above the horizon in the requested time series![/yellow bold] "
)
else:
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug(
"i [bold]Calculating[/bold] the [magenta]diffuse inclined irradiance[/magenta] for daylight moments .."
)
rear_side_calculated_diffuse_inclined_irradiance_series = calculate_diffuse_inclined_irradiance(
longitude=longitude,
latitude=latitude,
elevation=elevation,
timestamps=timestamps,
timezone=timezone,
surface_tilt=rear_side_surface_tilt,
surface_orientation=rear_side_surface_orientation,
linke_turbidity_factor_series=linke_turbidity_factor_series,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
global_horizontal_irradiance=global_horizontal_irradiance, # time series optional
direct_horizontal_irradiance=direct_horizontal_irradiance, # time series, optional
apply_reflectivity_factor=apply_reflectivity_factor,
solar_position_model=solar_position_model,
solar_incidence_model=solar_incidence_model,
zero_negative_solar_incidence_angle=zero_negative_solar_incidence_angle,
horizon_profile=horizon_profile,
shading_model=shading_model,
shading_states=shading_states,
solar_time_model=solar_time_model,
solar_constant=solar_constant,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
angle_output_units=angle_output_units,
dtype=dtype,
array_backend=array_backend,
# multi_thread=multi_thread,
verbose=verbose,
log=log,
)
rear_side_diffuse_horizontal_irradiance_series = (
rear_side_calculated_diffuse_inclined_irradiance_series.components.get(
DIFFUSE_HORIZONTAL_IRRADIANCE_COLUMN_NAME,
numpy.array([]),
)
)
rear_side_diffuse_inclined_irradiance_series[mask_above_horizon] = (
rear_side_calculated_diffuse_inclined_irradiance_series.value[
mask_above_horizon
]
) # .value is the diffuse irradiance series
rear_side_diffuse_inclined_irradiance_before_reflectivity_series = (
rear_side_calculated_diffuse_inclined_irradiance_series.components.get(
DIFFUSE_INCLINED_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME,
numpy.array([]),
)
)
rear_side_diffuse_inclined_reflectivity_factor_series = (
rear_side_calculated_diffuse_inclined_irradiance_series.components.get(
REFLECTIVITY_FACTOR_COLUMN_NAME, numpy.array([])
)
)
rear_side_diffuse_inclined_reflected_series = (
rear_side_calculated_diffuse_inclined_irradiance_series.components.get(
REFLECTIVITY_COLUMN_NAME, numpy.array([])
)
)
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug(
"i [bold]Calculating[/bold] the [magenta]reflected inclined irradiance[/magenta] for daylight moments .."
)
rear_side_calculated_ground_reflected_inclined_irradiance_series = (
calculate_ground_reflected_inclined_irradiance_series(
longitude=longitude,
latitude=latitude,
elevation=elevation,
surface_orientation=rear_side_surface_orientation,
surface_tilt=rear_side_surface_tilt,
timestamps=timestamps,
timezone=timezone,
global_horizontal_irradiance=global_horizontal_irradiance, # optional
linke_turbidity_factor_series=linke_turbidity_factor_series,
adjust_for_atmospheric_refraction=adjust_for_atmospheric_refraction,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
albedo=albedo,
apply_reflectivity_factor=apply_reflectivity_factor,
solar_position_model=solar_position_model,
solar_time_model=solar_time_model,
solar_constant=solar_constant,
eccentricity_phase_offset=eccentricity_phase_offset,
eccentricity_amplitude=eccentricity_amplitude,
# angle_output_units=angle_output_units,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
fingerprint=fingerprint,
)
)
rear_side_ground_reflected_inclined_irradiance_series[
mask_above_horizon
] = rear_side_calculated_ground_reflected_inclined_irradiance_series.value[
mask_above_horizon
] # .value is the ground reflected irradiance series
rear_side_ground_reflected_inclined_irradiance_before_reflectivity_series = rear_side_calculated_ground_reflected_inclined_irradiance_series.components.get(
REFLECTED_INCLINED_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME,
numpy.array([]),
)
rear_side_ground_reflected_inclined_reflectivity_factor_series = rear_side_calculated_ground_reflected_inclined_irradiance_series.components.get(
REFLECTIVITY_FACTOR_COLUMN_NAME,
numpy.array([]),
)
rear_side_ground_reflected_inclined_reflected_series = rear_side_calculated_ground_reflected_inclined_irradiance_series.components.get(
REFLECTIVITY_COLUMN_NAME,
numpy.array([]),
)
# sum components
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug(
"\ni [bold]Calculating[/bold] the [magenta]global inclined irradiance[/magenta] .."
)
rear_side_global_inclined_irradiance_before_reflectivity_series = (
rear_side_direct_inclined_irradiance_before_reflectivity_series
+ rear_side_diffuse_inclined_irradiance_before_reflectivity_series
+ rear_side_ground_reflected_inclined_irradiance_before_reflectivity_series
)
rear_side_global_inclined_irradiance_series = (
rear_side_direct_inclined_irradiance_series
+ rear_side_diffuse_inclined_irradiance_series
+ rear_side_ground_reflected_inclined_irradiance_series
)
# Does this make sense ?
# global_inclined_reflectivity_factor_series = (
# direct_inclined_reflectivity_factor_series
# + diffuse_inclined_reflectivity_factor_series
# + ground_reflected_inclined_reflectivity_factor_series
# )
rear_side_global_inclined_reflected_series = (
rear_side_direct_inclined_reflected_series
+ rear_side_diffuse_inclined_reflected_series
+ rear_side_ground_reflected_inclined_reflected_series
)
# -----------------------------------------------------------------------
# Try the following, to deduplicate code,
# global_inclined_irradiance_series = calculate_global_inclined_irradiance_series()
# ?
# -----------------------------------------------------------------------
if not power_model:
if not rear_side_efficiency: # user-set -- RenameMe ? FIXME
rear_side_efficiency_factor_series = system_efficiency
else:
rear_side_efficiency_factor_series = rear_side_efficiency
else:
rear_side_effective_global_irradiance_series = calculate_spectrally_corrected_effective_irradiance(
irradiance_series=rear_side_global_inclined_irradiance_before_reflectivity_series,
spectral_factor_series=spectral_factor_series,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
fingerprint=fingerprint,
)
if rear_side_efficiency:
array_parameters = {
"shape": timestamps.shape,
"dtype": dtype,
"init_method": rear_side_efficiency,
"backend": array_backend,
} # Borrow shape from timestamps
# direct
rear_side_efficiency_factor_series = create_array(**array_parameters)
else:
rear_side_efficiency_series = calculate_photovoltaic_efficiency_series(
irradiance_series=rear_side_global_inclined_irradiance_series,
photovoltaic_module=photovoltaic_module,
power_model=power_model,
temperature_model=temperature_model,
# model_constants=EFFICIENCY_MODEL_COEFFICIENTS_DEFAULT,
spectral_factor_series=spectral_factor_series, # required for the Power model !
temperature_series=temperature_series,
standard_test_temperature=TEMPERATURE_DEFAULT,
wind_speed_series=wind_speed_series,
radiation_cutoff_threshold=radiation_cutoff_threshold,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
rear_side_efficiency_factor_series = rear_side_efficiency_series.value
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug(
"i [bold]Applying[/bold] [magenta]efficiency coefficients[/magenta] on the global inclined irradiance .."
)
# Power Model efficiency coefficients include temperature and low irradiance effect !
rear_side_photovoltaic_power_output_without_system_loss_series = (
rear_side_global_inclined_irradiance_series * rear_side_efficiency_factor_series
) # Safer to deepcopy the efficiency_series which are modified _after_ this point ?
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug(
"i [bold]Applying[/bold] [magenta]system loss[/magenta] on the effective photovoltaic power .."
)
rear_side_photovoltaic_power_output_series = (
rear_side_photovoltaic_power_output_without_system_loss_series
* system_efficiency
)
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug("i [bold]Building the output[/bold] ..")
components_container = {
REAR_SIDE_PHOTOVOLTAIC_POWER_NAME: lambda: {
TITLE_KEY_NAME: REAR_SIDE_PHOTOVOLTAIC_POWER_NAME,
REAR_SIDE_PHOTOVOLTAIC_POWER_COLUMN_NAME: rear_side_photovoltaic_power_output_series,
PHOTOVOLTAIC_MODULE_TYPE_NAME: PhotovoltaicModuleType.Bifacial,
TECHNOLOGY_NAME: photovoltaic_module.value,
PEAK_POWER_COLUMN_NAME: peak_power,
PEAK_POWER_UNIT_NAME: PEAK_POWER_UNIT,
POWER_MODEL_COLUMN_NAME: (
power_model.value if power_model else NOT_AVAILABLE
),
}, # if verbose > 0 else {},
"Power extended": lambda: (
{
REAR_SIDE_PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME: rear_side_photovoltaic_power_output_without_system_loss_series,
}
if verbose > 1
else {}
),
"System loss": lambda: (
{
REAR_SIDE_EFFICIENCY_COLUMN_NAME: rear_side_efficiency_factor_series,
SYSTEM_EFFICIENCY_COLUMN_NAME: system_efficiency,
}
if verbose > 2
else {}
),
"Effective irradiance": lambda: (
{
TITLE_KEY_NAME: REAR_SIDE_PHOTOVOLTAIC_POWER_NAME
+ " & effective components",
REAR_SIDE_EFFECTIVE_GLOBAL_IRRADIANCE_COLUMN_NAME: rear_side_global_inclined_irradiance_series
* rear_side_efficiency_factor_series,
REAR_SIDE_EFFECTIVE_DIRECT_IRRADIANCE_COLUMN_NAME: rear_side_direct_inclined_irradiance_series
* rear_side_efficiency_factor_series,
REAR_SIDE_EFFECTIVE_DIFFUSE_IRRADIANCE_COLUMN_NAME: rear_side_diffuse_inclined_irradiance_series
* rear_side_efficiency_factor_series,
REAR_SIDE_EFFECTIVE_REFLECTED_IRRADIANCE_COLUMN_NAME: rear_side_ground_reflected_inclined_irradiance_series
* rear_side_efficiency_factor_series,
REAR_SIDE_SPECTRAL_EFFECT_COLUMN_NAME: rear_side_effective_global_irradiance_series.components.get(
SPECTRAL_EFFECT_COLUMN_NAME, numpy.array([])
),
REAR_SIDE_SPECTRAL_EFFECT_PERCENTAGE_COLUMN_NAME: rear_side_effective_global_irradiance_series.components.get(
SPECTRAL_EFFECT_PERCENTAGE_COLUMN_NAME, numpy.array([])
),
REAR_SIDE_SPECTRAL_FACTOR_COLUMN_NAME: rear_side_effective_global_irradiance_series.components.get(
SPECTRAL_FACTOR_COLUMN_NAME, numpy.array([])
),
}
if verbose > 3
else {}
),
"Reflectivity": lambda: (
{
REAR_SIDE_REFLECTIVITY_COLUMN_NAME: rear_side_global_inclined_reflected_series,
# REFLECTIVITY_PERCENTAGE_COLUMN_NAME: global_inclined_reflectivity_loss_percentage_series if global_inclined_reflectivity_loss_percentage_series.size > 1 else NOT_AVAILABLE,
# REFLECTIVITY_FACTOR_COLUMN_NAME: global_reflectivity_factor_series if global_reflectivity_factor_series.size > 1 else NOT_AVAILABLE,
REAR_SIDE_DIRECT_INCLINED_IRRADIANCE_REFLECTIVITY_COLUMN_NAME: rear_side_direct_inclined_reflectivity_factor_series,
REAR_SIDE_DIFFUSE_INCLINED_IRRADIANCE_REFLECTIVITY_COLUMN_NAME: rear_side_diffuse_inclined_reflectivity_factor_series,
REAR_SIDE_REFLECTED_INCLINED_IRRADIANCE_REFLECTIVITY_COLUMN_NAME: rear_side_ground_reflected_inclined_reflectivity_factor_series,
}
if verbose > 6 and apply_reflectivity_factor
else {}
),
"Inclined irradiance components": lambda: (
{
REAR_SIDE_GLOBAL_INCLINED_IRRADIANCE_COLUMN_NAME: rear_side_global_inclined_irradiance_series,
REAR_SIDE_DIRECT_INCLINED_IRRADIANCE_COLUMN_NAME: rear_side_direct_inclined_irradiance_series,
REAR_SIDE_DIFFUSE_INCLINED_IRRADIANCE_COLUMN_NAME: rear_side_diffuse_inclined_irradiance_series,
REAR_SIDE_REFLECTED_INCLINED_IRRADIANCE_COLUMN_NAME: rear_side_ground_reflected_inclined_irradiance_series,
}
if verbose > 4
else {}
),
"more_extended_2": lambda: (
{
TITLE_KEY_NAME: REAR_SIDE_PHOTOVOLTAIC_POWER_NAME
+ ", effective & in-plane components",
REAR_SIDE_GLOBAL_INCLINED_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME: rear_side_global_inclined_irradiance_before_reflectivity_series,
REAR_SIDE_DIRECT_INCLINED_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME: rear_side_direct_inclined_irradiance_before_reflectivity_series,
REAR_SIDE_DIFFUSE_INCLINED_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME: rear_side_diffuse_inclined_irradiance_before_reflectivity_series,
REAR_SIDE_REFLECTED_INCLINED_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME: rear_side_ground_reflected_inclined_irradiance_before_reflectivity_series,
}
if verbose > 5 and apply_reflectivity_factor
else {}
),
"Horizontal irradiance components": lambda: (
{
REAR_SIDE_DIRECT_HORIZONTAL_IRRADIANCE_COLUMN_NAME: rear_side_direct_horizontal_irradiance_series,
REAR_SIDE_DIFFUSE_HORIZONTAL_IRRADIANCE_COLUMN_NAME: rear_side_diffuse_horizontal_irradiance_series,
# Rear-side Ground-Reflected Horizontal Irradiance should be zero for horizontal surfaces !?
}
if verbose > 6
else {}
),
"Meteorological variables": lambda: (
{
TEMPERATURE_COLUMN_NAME: temperature_series.value,
WIND_SPEED_COLUMN_NAME: wind_speed_series.value,
}
if verbose > 7
else {}
),
"Solar position": lambda: (
{
INCIDENCE_COLUMN_NAME: (
rear_side_calculated_direct_inclined_irradiance_series.components[
INCIDENCE_COLUMN_NAME
]
if rear_side_calculated_direct_inclined_irradiance_series.components
else NOT_AVAILABLE
),
ALTITUDE_COLUMN_NAME: getattr(
solar_altitude_series, angle_output_units
),
AZIMUTH_COLUMN_NAME: getattr(solar_azimuth_series, angle_output_units),
SUN_HORIZON_POSITION_COLUMN_NAME: sun_horizon_position_series,
}
if verbose > 9
else {}
),
"Surface Position Metadata": lambda: (
{
REAR_SIDE_SURFACE_ORIENTATION_COLUMN_NAME: convert_float_to_degrees_if_requested(
rear_side_surface_orientation, angle_output_units
),
REAR_SIDE_SURFACE_TILT_COLUMN_NAME: convert_float_to_degrees_if_requested(
rear_side_surface_tilt, angle_output_units
),
SHADING_ALGORITHM_COLUMN_NAME: (
surface_in_shade_series.shading_algorithm
if horizon_profile is not None
else "Not performed"
),
SHADING_STATES_COLUMN_NAME: (
shading_states if shading_states else NOT_AVAILABLE
),
}
if verbose # > 8
else {}
),
"Surface position": lambda: (
{
SURFACE_IN_SHADE_COLUMN_NAME: surface_in_shade_series.value,
}
if verbose > 1
else {}
),
"Solar Position Metadata": lambda: {
UNIT_NAME: angle_output_units,
INCIDENCE_ALGORITHM_COLUMN_NAME: (
rear_side_calculated_direct_inclined_irradiance_series.components[
INCIDENCE_ALGORITHM_COLUMN_NAME
]
if rear_side_calculated_direct_inclined_irradiance_series.components
else NOT_AVAILABLE
),
INCIDENCE_DEFINITION: (
rear_side_calculated_direct_inclined_irradiance_series.components[
INCIDENCE_DEFINITION
]
if rear_side_calculated_direct_inclined_irradiance_series.components
else NOT_AVAILABLE
),
SUN_HORIZON_POSITIONS_NAME: sun_horizon_positions, # Requested positions
AZIMUTH_ORIGIN_COLUMN_NAME: getattr(solar_azimuth_series, "origin"),
POSITION_ALGORITHM_COLUMN_NAME: solar_altitude_series.solar_positioning_algorithm,
TIME_ALGORITHM_COLUMN_NAME: solar_altitude_series.solar_timing_algorithm,
SOLAR_CONSTANT_COLUMN_NAME: solar_constant,
ECCENTRICITY_PHASE_OFFSET_COLUMN_NAME: eccentricity_phase_offset,
ECCENTRICITY_CORRECTION_FACTOR_COLUMN_NAME: eccentricity_amplitude,
# ABOVE_HORIZON_COLUMN_NAME: mask_above_horizon,
# LOW_ANGLE_COLUMN_NAME: mask_low_angle,
# BELOW_HORIZON_COLUMN_NAME: mask_below_horizon,
},
"Fingerprint": lambda: (
{
FINGERPRINT_COLUMN_NAME: generate_hash(
rear_side_photovoltaic_power_output_series
),
}
if fingerprint
else {}
),
}
components = {}
for _, component in components_container.items():
components.update(component())
# Overwrite the direct irradiance 'components' with the global ones !
# components = components | calculated_direct_inclined_irradiance_series.components
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
if profile:
import io
import pstats
pr.disable()
# write profiling statistics to file
profile_filename = "profiling_stats.prof"
pr.dump_stats(profile_filename)
print(f"Profiling statistics saved to {profile_filename}")
s = io.StringIO()
sortby = pstats.SortKey.CUMULATIVE
ps = pstats.Stats(pr, stream=s).sort_stats(sortby)
ps.print_stats()
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
print(s.getvalue())
log_data_fingerprint(
data=rear_side_photovoltaic_power_output_series,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return PhotovoltaicPower(
value=rear_side_photovoltaic_power_output_series,
unit=POWER_UNIT,
solar_positioning_algorithm="",
solar_timing_algorithm="",
elevation=elevation,
surface_orientation=rear_side_surface_orientation,
surface_tilt=rear_side_surface_tilt,
irradiance=rear_side_global_inclined_irradiance_series,
components=components,
)
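The sun–horizon classification above partitions the requested timestamps into below-horizon, low-angle and above-horizon sets, applying each mask only to elements not yet classified. A minimal, self-contained sketch of that masking pattern is shown below; the altitude values and the `low_angle_threshold` are hypothetical (the real code reads the threshold from `solar_altitude_series.low_angle_threshold_radians`):

```python
import numpy

# Hypothetical solar altitudes in radians for six timestamps
altitude = numpy.array([-0.2, -0.05, 0.005, 0.02, 0.5, 1.0])
low_angle_threshold = 0.01  # assumed threshold, roughly 0.57 degrees

# Object array initialised to None marks "not yet classified"
position = numpy.empty(altitude.shape, dtype=object)

# Below horizon: altitude < 0
mask_below = altitude < 0
position[mask_below] = "below"

# Low angle: 0 <= altitude < threshold, and not yet classified.
# numpy.logical_and() takes exactly two array operands; for three
# conditions use numpy.logical_and.reduce over a list of masks.
mask_low = numpy.logical_and.reduce(
    [altitude >= 0, altitude < low_angle_threshold, position == None]
)
position[mask_low] = "low_angle"

# Above horizon: everything still unset with positive altitude
mask_above = numpy.logical_and(altitude > 0, position == None)
position[mask_above] = "above"

print(list(position))
# → ['below', 'below', 'low_angle', 'above', 'above', 'above']
```

Because each later mask requires `position == None`, the order of the blocks establishes a precedence (below, then low-angle, then above) and no element is classified twice.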
efficiency ¶
Functions:
| Name | Description |
|---|---|
calculate_photovoltaic_efficiency_series | Calculate the photovoltaic (PV) module efficiency for a time series. |
calculate_photovoltaic_efficiency_series ¶
calculate_photovoltaic_efficiency_series(
irradiance_series: InclinedIrradiance,
photovoltaic_module: PhotovoltaicModuleModel = CSI_FREE_STANDING,
photovoltaic_module_type: PhotovoltaicModuleType = Monofacial,
bifaciality_factor: float = 0.3,
power_model: PhotovoltaicModulePerformanceModel = king,
spectral_factor_series: SpectralFactorSeries = SpectralFactorSeries(
value=SPECTRAL_FACTOR_DEFAULT
),
radiation_cutoff_threshold: float = RADIATION_CUTOFF_THRESHHOLD,
temperature_model: ModuleTemperatureAlgorithm = faiman,
temperature_series: TemperatureSeries = TemperatureSeries(
value=TEMPERATURE_DEFAULT
),
standard_test_temperature: float = TEMPERATURE_DEFAULT,
wind_speed_series: WindSpeedSeries = array(
WIND_SPEED_DEFAULT
),
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> PhotovoltaicModuleEfficiency
Calculate the photovoltaic (PV) module efficiency for a time series.
Calculate the photovoltaic (PV) module efficiency for a time series based on solar irradiance and PV technology-specific efficiency coefficients, the spectral effect factor, temperature and wind speed, including a detailed gain/loss report.
The spectral effect arises from differences between the sunlight spectrum and standardised artificial light spectrum.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
irradiance_series | List[float] | List of irradiance values over time. | required |
spectral_factor_series | SpectralFactorSeries | List of spectral factors corresponding to the irradiance series. | SpectralFactorSeries(value=SPECTRAL_FACTOR_DEFAULT) |
model_constants | List[float] | List of coefficients for the efficiency model. Default is EFFICIENCY_MODEL_COEFFICIENTS_DEFAULT. | required |
temperature_model | ModuleTemperatureAlgorithm | Algorithm used for temperature correction. Default is ModuleTemperatureAlgorithm.faiman. | faiman |
temperature_series | ndarray | Numpy array of temperature values over time. Default is np.array(TEMPERATURE_DEFAULT). | TemperatureSeries(value=TEMPERATURE_DEFAULT) |
standard_test_temperature | float | Temperature used in standard test conditions. Default is TEMPERATURE_DEFAULT. | TEMPERATURE_DEFAULT |
wind_speed_series | ndarray | Numpy array of wind speed values over time. Default is np.array(WIND_SPEED_DEFAULT). | array(WIND_SPEED_DEFAULT) |
power_model | PhotovoltaicModulePerformanceModel | Algorithm used for calculating PV module power. Default is PhotovoltaicModulePerformanceModel.king. | king |
radiation_cutoff_threshold | float | Minimum irradiance threshold for calculations. Default is RADIATION_CUTOFF_THRESHOLD. | RADIATION_CUTOFF_THRESHHOLD |
verbose | int | Level of verbosity for output data. Default is VERBOSE_LEVEL_DEFAULT. | VERBOSE_LEVEL_DEFAULT |
Returns:
| Name | Type | Description |
|---|---|---|
efficiency_series | ndarray | Array of calculated efficiency values for the PV module. |
results | (dict, optional) | Dictionary containing detailed results and intermediate calculations. Provided when `verbose > 0`. |
Raises:
| Type | Description |
|---|---|
ValueError | If an insufficient number of model constants is provided. |
Notes
Currently, external time series of monthly spectral factors are centered at the beginning of the month and applied plus/minus half a month.
Examples:
>>> calculate_photovoltaic_efficiency_series([1000, 950], [1.1, 1.05], temperature_series=np.array([25, 26]))
# Returns efficiency series and possibly detailed results based on the verbose level.
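The default `temperature_model` is `faiman`. As a point of orientation, the Faiman (2008) model estimates module temperature from ambient temperature, in-plane irradiance and wind speed as T_mod = T_air + G / (u0 + u1·v). The sketch below shows that step alone, with the commonly used heat-loss coefficients u0 = 25.0 and u1 = 6.84; the coefficients actually applied by `calculate_photovoltaic_module_temperature_series` may differ:

```python
import numpy

def faiman_module_temperature(irradiance, air_temperature, wind_speed,
                              u0=25.0, u1=6.84):
    """Faiman (2008) module temperature: T_mod = T_air + G / (u0 + u1 * v).

    irradiance      : plane-of-array irradiance in W/m^2
    air_temperature : ambient temperature in degrees Celsius
    wind_speed      : wind speed in m/s
    u0, u1          : heat-loss coefficients
    """
    irradiance = numpy.asarray(irradiance, dtype=float)
    return air_temperature + irradiance / (u0 + u1 * wind_speed)

# With zero irradiance the module sits at air temperature;
# under load it heats up, less so at higher wind speeds.
module_temperature = faiman_module_temperature(
    irradiance=[0.0, 800.0], air_temperature=20.0, wind_speed=1.0
)
print(module_temperature)  # → [20.0, ~45.13]
```

The resulting module temperature then feeds the power-model coefficients (King by default) to derive the relative efficiency series.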
Source code in pvgisprototype/api/power/efficiency.py
@log_function_call
def calculate_photovoltaic_efficiency_series(
irradiance_series: InclinedIrradiance,
photovoltaic_module: PhotovoltaicModuleModel = PhotovoltaicModuleModel.CSI_FREE_STANDING,
photovoltaic_module_type: PhotovoltaicModuleType = PhotovoltaicModuleType.Monofacial, # Leave Me Like This !
bifaciality_factor: float = 0.3, # 0.7, # Fixed !
power_model: PhotovoltaicModulePerformanceModel = PhotovoltaicModulePerformanceModel.king,
spectral_factor_series: SpectralFactorSeries = SpectralFactorSeries(
value=SPECTRAL_FACTOR_DEFAULT
),
radiation_cutoff_threshold: float = RADIATION_CUTOFF_THRESHHOLD,
temperature_model: ModuleTemperatureAlgorithm = ModuleTemperatureAlgorithm.faiman,
temperature_series: TemperatureSeries = TemperatureSeries(
value=TEMPERATURE_DEFAULT
),
standard_test_temperature: float = TEMPERATURE_DEFAULT,
wind_speed_series: WindSpeedSeries = np.array(WIND_SPEED_DEFAULT),
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> PhotovoltaicModuleEfficiency:
"""Calculate the photovoltaic (PV) module efficiency for a time series.
Calculate the photovoltaic (PV) module efficiency for a time series based
on solar irradiance and PV technology-specific efficiency coefficients, the
spectral effect factor, temperature and wind speed, including a detailed
gain/loss report.
The spectral effect arises from differences between the sunlight spectrum
and standardised artificial light spectrum.
Parameters
----------
irradiance_series : List[float]
List of irradiance values over time.
spectral_factor_series: SpectralFactorSeries
List of spectral factors corresponding to the irradiance series.
photovoltaic_module : PhotovoltaicModuleModel, optional
PV module technology whose efficiency coefficients are used. Default is PhotovoltaicModuleModel.CSI_FREE_STANDING.
temperature_model : ModuleTemperatureAlgorithm, optional
Algorithm used for temperature correction. Default is ModuleTemperatureAlgorithm.faiman.
temperature_series : np.ndarray, optional
Numpy array of temperature values over time. Default is np.array(TEMPERATURE_DEFAULT).
standard_test_temperature : float, optional
Temperature used in standard test conditions. Default is TEMPERATURE_DEFAULT.
wind_speed_series : np.ndarray, optional
Numpy array of wind speed values over time. Default is np.array(WIND_SPEED_DEFAULT).
power_model : PhotovoltaicModulePerformanceModel, optional
Algorithm used for calculating PV module power. Default is PhotovoltaicModulePerformanceModel.king.
radiation_cutoff_threshold : float, optional
Minimum irradiance threshold for calculations. Default is RADIATION_CUTOFF_THRESHOLD.
verbose : int, optional
Level of verbosity for output data. Default is VERBOSE_LEVEL_DEFAULT.
Returns
-------
efficiency_series : np.ndarray
Array of calculated efficiency values for the PV module.
results : dict, optional
Dictionary containing detailed results and intermediate calculations. Provided when `verbose > 0`.
Raises
------
ValueError
If an insufficient number of model constants is provided.
Notes
-----
Currently, external time series of monthly spectral factors are centered at
the beginning of the month and applied plus/minus half a month.
Examples
--------
>>> calculate_photovoltaic_efficiency_series([1000, 950], [1.1, 1.05], temperature_series=np.array([25, 26]))
# Returns efficiency series and possibly detailed results based on the verbose level.
"""
temperature_adjusted_series = calculate_photovoltaic_module_temperature_series(
irradiance_series=irradiance_series, # without considering the spectral effect !
photovoltaic_module=photovoltaic_module,
temperature_model=temperature_model,
temperature_series=temperature_series,
wind_speed_series=wind_speed_series,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
effective_irradiance_series = calculate_spectrally_corrected_effective_irradiance(
irradiance_series=irradiance_series,
spectral_factor_series=spectral_factor_series,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
fingerprint=fingerprint,
)
efficiency_series = calculate_efficiency_factor_series(
effective_irradiance_series=effective_irradiance_series.value,
radiation_cutoff_threshold=radiation_cutoff_threshold,
photovoltaic_module=photovoltaic_module,
power_model=power_model,
temperature_series=temperature_adjusted_series, # important !
standard_test_temperature=standard_test_temperature,
dtype=dtype,
array_backend=array_backend,
verbose=verbose,
log=log,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=efficiency_series.value,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return PhotovoltaicModuleEfficiency(
value=efficiency_series.value,
effective_irradiance=effective_irradiance_series,
temperature_adjusted_series=temperature_adjusted_series,
# photovoltaic_module=efficiency_series.photovoltaic_module,
# photovoltaic_module_efficiency_coefficients=efficiency_series.photovoltaic_module_efficiency_coefficients,
# power_model=efficiency_series.power_model,
# radiation_cutoff_threshold=efficiency_series.radiation_cutoff_threshold,
# temperature_model=temperature_model,
)
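Conceptually, the spectral correction that `calculate_spectrally_corrected_effective_irradiance` applies amounts to scaling each irradiance sample by its spectral factor. A minimal NumPy sketch of that step (illustrative names, not the internal API):

```python
import numpy as np

def spectrally_corrected_irradiance(
    irradiance_series: np.ndarray,
    spectral_factor_series: np.ndarray,
) -> np.ndarray:
    """Scale irradiance by the mismatch between the actual sunlight spectrum
    and the standard test spectrum; a factor of 1.0 means no spectral effect."""
    return irradiance_series * spectral_factor_series

effective = spectrally_corrected_irradiance(
    np.array([1000.0, 950.0]), np.array([1.1, 1.05])
)
# ≈ [1100.0, 997.5]
```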
temperature ¶
Functions:
| Name | Description |
|---|---|
calculate_photovoltaic_module_temperature_series | Note that the irradiance series input should be the uncorrected |
calculate_photovoltaic_module_temperature_series ¶
calculate_photovoltaic_module_temperature_series(
irradiance_series: InclinedIrradiance,
photovoltaic_module: PhotovoltaicModuleModel = CSI_FREE_STANDING,
temperature_model: ModuleTemperatureAlgorithm = faiman,
temperature_series: TemperatureSeries = average_air_temperature,
wind_speed_series: WindSpeedSeries = array(
WIND_SPEED_DEFAULT
),
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Note that the irradiance series input should be the uncorrected irradiance, i.e. not adjusted for spectral effects.
Source code in pvgisprototype/api/power/temperature.py
@log_function_call
def calculate_photovoltaic_module_temperature_series(
irradiance_series: InclinedIrradiance,
photovoltaic_module: PhotovoltaicModuleModel = PhotovoltaicModuleModel.CSI_FREE_STANDING,
temperature_model: ModuleTemperatureAlgorithm = ModuleTemperatureAlgorithm.faiman,
temperature_series: TemperatureSeries = TemperatureSeries().average_air_temperature,
wind_speed_series: WindSpeedSeries = np.array(WIND_SPEED_DEFAULT),
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
"""
Note that the irradiance series input should be the _uncorrected_
irradiance, i.e. not adjusted for spectral effects.
"""
photovoltaic_module_efficiency_coefficients = (
get_coefficients_for_photovoltaic_module(photovoltaic_module)
)
temperature_adjusted_series = deepcopy(temperature_series) # Safe !
if temperature_model == ModuleTemperatureAlgorithm.faiman:
temperature_adjusted_series = calculate_photovoltaic_module_temperature_faiman(
temperature_series=temperature_series,
wind_speed_series=wind_speed_series,
photovoltaic_module_efficiency_coefficients=photovoltaic_module_efficiency_coefficients,
irradiance_series=irradiance_series,
)
# temperature_adjustment_series = temperature_adjusted_series.value - temperature_series.value
# temperature_adjustment_percentage_series = 100 * where(
# temperature_series != 0,
# (temperature_adjusted_series.value - temperature_series.value)
# / (temperature_series.value),
# 0,
# )
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=temperature_adjusted_series,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return temperature_adjusted_series
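When the Faiman algorithm is selected, module temperature is commonly modelled as T_mod = T_air + G / (u0 + u1·v). A hedged sketch with the widely published default heat-loss coefficients u0 = 25.0 W/(m²·K) and u1 = 6.84 W·s/(m³·K); the real coefficients here come from `get_coefficients_for_photovoltaic_module`:

```python
import numpy as np

def faiman_module_temperature(
    air_temperature: np.ndarray,  # ambient air temperature, degrees C
    irradiance: np.ndarray,       # in-plane irradiance, W/m2
    wind_speed: np.ndarray,       # wind speed, m/s
    u0: float = 25.0,             # constant heat-loss coefficient (illustrative default)
    u1: float = 6.84,             # wind-dependent heat-loss coefficient (illustrative default)
) -> np.ndarray:
    """Faiman (2008) steady-state module temperature model."""
    return air_temperature + irradiance / (u0 + u1 * wind_speed)

t_mod = faiman_module_temperature(
    np.array([25.0]), np.array([800.0]), np.array([1.0])
)
# ≈ 25 + 800 / 31.84 ≈ 50.1 degrees C
```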
quick_response_code ¶
Functions:
| Name | Description |
|---|---|
generate_quick_response_code | QUICK_RESPONSE_CODE_MOCKUP = "Position 45.812 8.628, Elevation 214, |
generate_quick_response_code_optimal_surface_position | Generates a quick response code (QR code) containing information about |
generate_quick_response_code ¶
generate_quick_response_code(
dictionary: dict,
longitude: float,
latitude: float,
elevation: float | None = None,
surface_orientation: bool = True,
surface_tilt: bool = True,
timestamps: DatetimeIndex = DatetimeIndex([now()]),
rounding_places: int = ROUNDING_PLACES_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
output_type: QuickResponseCode = Base64,
) -> str | Image | None
QUICK_RESPONSE_CODE_MOCKUP = "Position 45.812 8.628, Elevation 214, Orientation 180, Tilt 0, Start 2005-01-01, End 2020-12-31, Zone UTC, In-Plane Irradiance 22735.7㎾/m², PV Power 17494.6 ㎾, Loss -5241.1 ㎾/m², Time of Min 2005-01-01T00:00, Time of Max 2006-05-30T11:00, Data sources: Irradiance: SARAH2 xxxx, Temperature & Wind Speed: ERA5 xxxx, Spectral factor: PVGIS 2013, Power Model: Huld 2011, Positioning: NOAA Solar Geometry Equations, Incidence angle: Iqbal 1992, Fingerprint: e9bed6970bc502ae912bdbaf792ef694e449063d4bb6ccd77ab9621a045cbf26"
Source code in pvgisprototype/api/quick_response_code.py
def generate_quick_response_code(
dictionary: dict,
longitude: float,
latitude: float,
elevation: float | None = None,
surface_orientation: bool = True,
surface_tilt: bool = True,
timestamps: DatetimeIndex = DatetimeIndex([datetime.now()]),
rounding_places: int = ROUNDING_PLACES_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
output_type: QuickResponseCode = QuickResponseCode.Base64,
) -> str | Image | None:
"""
QUICK_RESPONSE_CODE_MOCKUP = "Position 45.812 8.628, Elevation 214,
Orientation 180, Tilt 0, Start 2005-01-01, End 2020-12-31, Zone UTC,
In-Plane Irradiance 22735.7㎾/m², PV Power 17494.6 ㎾, Loss -5241.1 ㎾/m²,
Time of Min 2005-01-01T00:00, Time of Max 2006-05-30T11:00, Data sources:
Irradiance: SARAH2 xxxx, Temperature & Wind Speed: ERA5 xxxx, Spectral
factor: PVGIS 2013, Power Model: Huld 2011, Positioning: NOAA Solar
Geometry Equations, Incidence angle: Iqbal 1992, Fingerprint:
e9bed6970bc502ae912bdbaf792ef694e449063d4bb6ccd77ab9621a045cbf26"
"""
# Get float values from dictionary
surface_orientation = dictionary.get(SURFACE_ORIENTATION_COLUMN_NAME, "")
surface_tilt = dictionary.get(SURFACE_TILT_COLUMN_NAME, "")
# Get the "frequency" from the timestamps
#time_groupings = {
# "YE": "Yearly",
# "S": "Seasonal",
# "ME": "Monthly",
# "W": "Weekly",
# "D": "Daily",
# "3H": "3-Hourly",
# "H": "Hourly",
#}
frequency = timestamps.freqstr
if timestamps.inferred_freq is None:
frequency = "H"
if timestamps.year.unique().size > 1:
frequency = "YE"
elif timestamps.month.unique().size > 1:
frequency = "ME"
elif timestamps.to_period(frequency).week.unique().size > 1:
frequency = "W"
elif timestamps.day.unique().size > 1:
frequency = "D"
elif timestamps.hour.unique().size < 24:
frequency = "H"
else:
frequency = "3H"
# frequency_label = time_groupings[frequency]
# In order to avoid unbound errors we pre-define `_series` objects
array_parameters = {
"shape": timestamps.shape,
"dtype": dtype,
"init_method": "zeros",
"backend": array_backend,
} # Borrow shape from timestamps
# Process series
inclined_irradiance_series = dictionary.get(
GLOBAL_INCLINED_IRRADIANCE_BEFORE_REFLECTIVITY_COLUMN_NAME, create_array(**array_parameters)
)
inclined_irradiance_mean = calculate_mean_of_series_per_time_unit(
inclined_irradiance_series, timestamps=timestamps, frequency=frequency
)
photovoltaic_power_without_system_loss_series = dictionary.get(
PHOTOVOLTAIC_POWER_WITHOUT_SYSTEM_LOSS_COLUMN_NAME, create_array(**array_parameters)
)
photovoltaic_power_without_system_loss, _ = calculate_sum_and_percentage(
photovoltaic_power_without_system_loss_series,
reference_series=1,
rounding_places=rounding_places,
dtype=dtype,
array_backend=array_backend,
)
photovoltaic_power_without_system_loss_mean = (
calculate_mean_of_series_per_time_unit(
photovoltaic_power_without_system_loss_series,
timestamps=timestamps,
frequency=frequency,
)
)
photovoltaic_power_series = dictionary.get(
PHOTOVOLTAIC_POWER_COLUMN_NAME, create_array(**array_parameters)
)
photovoltaic_power_mean = calculate_mean_of_series_per_time_unit(
photovoltaic_power_series, timestamps=timestamps, frequency=frequency
)
system_efficiency_series = dictionary.get(SYSTEM_EFFICIENCY_COLUMN_NAME, create_array(**array_parameters))
system_efficiency = numpy.nanmedian(system_efficiency_series).astype(
dtype
) # Just in case we ever get time series of `system_efficiency` !
#system_efficiency_change = (
# photovoltaic_power_without_system_loss * system_efficiency
# - photovoltaic_power_without_system_loss
#)
system_efficiency_change_mean = calculate_mean_of_series_per_time_unit(
photovoltaic_power_without_system_loss_mean * system_efficiency
- photovoltaic_power_without_system_loss_mean,
timestamps=timestamps,
frequency=frequency,
)
# Build output string
data = ""
data += "Lat " + str(round_float_values(latitude, rounding_places)) + ", "
data += "Lon " + str(round_float_values(longitude, rounding_places)) + ", "
# data += 'Elevation ' + str(round_float_values(elevation, 0)) + ', '
data += "Elevation " + str(int(elevation)) + ", "
if isinstance(surface_orientation, list):
data += (
"Orientation "
+ ",".join([str(round_float_values(value, rounding_places)) for value in surface_orientation])
+ ", "
)
else:
data += "Orientation " + str(round_float_values(surface_orientation, rounding_places)) + ", "
if isinstance(surface_tilt, list):
data += "Tilt " + ",".join([str(round_float_values(value, rounding_places)) for value in surface_tilt]) + ", "
else:
data += "Tilt " + str(round_float_values(surface_tilt, rounding_places)) + ", "
data += "Start " + str(timestamps[0].strftime("%Y-%m-%d %H:%M")) + ", "
data += "End " + str(timestamps[-1].strftime("%Y-%m-%d %H:%M")) + ", "
data += (
"Irradiance "
+ str(round_float_values(inclined_irradiance_mean, 1))
+ f" {IRRADIANCE_UNIT_K}, "
)
data += (
"Power "
+ str(round_float_values(photovoltaic_power_mean, 1))
+ f" {POWER_UNIT}, "
)
data += (
"Loss "
+ str(round_float_values(system_efficiency_change_mean, 1))
+ f" {IRRADIANCE_UNIT_K}, "
)
# data_source_irradiance = str
# data_source_temperature = str
# data_source_wind_speed = str
# model_source_spectral_factor = 'PVGIS 2013'
# model_photovoltaic_module_performance = 'Huld 2011'
# algorithm_positioning = 'NOAA'
# algorithm_incidence = 'Iqbal 1992'
fingerprint = dictionary.get(FINGERPRINT_COLUMN_NAME, "")
data += f"Fingerprint {fingerprint}, "
from pvgisprototype._version import __version__
data += f"PVGIS v6 ({__version__})"
qr = qrcode.QRCode(
version=1,
error_correction=qrcode.constants.ERROR_CORRECT_L,
box_size=10,
border=4,
)
qr.add_data(data)
qr.make(fit=True)
if output_type == QuickResponseCode.Image:
return qr # needs to .make_image()
if output_type == QuickResponseCode.Base64:
import base64
from io import BytesIO
image = qr.make_image()
buffer = BytesIO()
image.save(buffer, format="PNG")
image_bytes = buffer.getvalue()
image_base64 = base64.b64encode(image_bytes).decode("utf-8")
return image_base64
return None
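The Base64 branch returns a string that clients can embed directly or decode back into PNG bytes. A stdlib-only sketch of that round trip (`png_bytes` is a placeholder standing in for the rendered QR image):

```python
import base64

# Pretend this came from generate_quick_response_code(..., output_type=Base64)
png_bytes = b"\x89PNG\r\n\x1a\n"  # placeholder PNG signature bytes
image_base64 = base64.b64encode(png_bytes).decode("utf-8")

# Embed in HTML as a data URI ...
html_snippet = f'<img src="data:image/png;base64,{image_base64}"/>'

# ... or decode back to bytes and write to disk
decoded = base64.b64decode(image_base64)
assert decoded == png_bytes
```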
generate_quick_response_code_optimal_surface_position ¶
generate_quick_response_code_optimal_surface_position(
dictionary: dict,
longitude: float,
latitude: float,
elevation: float | None = None,
surface_orientation: (
float | SurfaceOrientation | None
) = None,
surface_tilt: float | SurfaceTilt | None = None,
timestamps: DatetimeIndex = DatetimeIndex([now()]),
rounding_places: int = ROUNDING_PLACES_DEFAULT,
output_type: QuickResponseCode = Base64,
) -> str | Image | None
Generates a quick response code (QR code) containing information about the optimal surface position for photovoltaic performance.
This function creates a QR code based on the provided geographical and photovoltaic data. It includes information such as latitude, longitude, elevation, optimal orientation and tilt, timestamps, mean photovoltaic power, and fingerprint. The output can be in Base64 format or as an image object.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
dictionary | dict | A dictionary containing photovoltaic data. | required |
longitude | float | The longitude of the location. | required |
latitude | float | The latitude of the location. | required |
elevation | float | None | The elevation of the location. | None |
surface_orientation | float | SurfaceOrientation | None | The surface orientation value or object. | None |
surface_tilt | float | SurfaceTilt | None | The surface tilt value or object. | None |
timestamps | DatetimeIndex | A pandas DatetimeIndex of timestamps. Defaults to the current datetime. | DatetimeIndex([now()]) |
rounding_places | int | Number of decimal places for rounding. | ROUNDING_PLACES_DEFAULT |
output_type | QuickResponseCode | The type of QR code to output. | Base64 |
Returns:
| Type | Description |
|---|---|
str | Image | None | The QR code as a Base64 string, an image object, or None if the output type is not recognized. |
Source code in pvgisprototype/api/quick_response_code.py
def generate_quick_response_code_optimal_surface_position(
dictionary: dict,
longitude: float,
latitude: float,
elevation: float | None = None,
surface_orientation: float | SurfaceOrientation | None = None,
surface_tilt: float | SurfaceTilt | None = None,
timestamps: DatetimeIndex = DatetimeIndex([datetime.now()]),
rounding_places: int = ROUNDING_PLACES_DEFAULT,
output_type: QuickResponseCode = QuickResponseCode.Base64,
) -> str | Image | None:
"""
Generates a quick response code (QR code) containing information about
the optimal surface position for photovoltaic performance.
This function creates a QR code based on the provided geographical and
photovoltaic data. It includes information such as latitude, longitude,
elevation, optimal orientation and tilt, timestamps, mean photovoltaic
power, and fingerprint. The output can be in Base64 format or as an
image object.
Args:
dictionary (dict): A dictionary containing photovoltaic data.
longitude (float): The longitude of the location.
latitude (float): The latitude of the location.
elevation (float | None, optional): The elevation of the location.
surface_orientation (float | SurfaceOrientation | None, optional):
The surface orientation value or object.
surface_tilt (float | SurfaceTilt | None, optional): The surface
tilt value or object.
timestamps (DatetimeIndex, optional): A pandas DatetimeIndex of
timestamps. Defaults to the current datetime.
rounding_places (int, optional): Number of decimal places for
rounding. Defaults to ROUNDING_PLACES_DEFAULT.
output_type (QuickResponseCode, optional): The type of QR code to
output. Defaults to QuickResponseCode.Base64.
Returns:
str | Image | None: The QR code as a Base64 string, an image object,
or None if the output type is not recognized.
"""
# Get float values from dictionary
surface_orientation = dictionary.get(SURFACE_ORIENTATION_NAME, "")
surface_tilt = dictionary.get(SURFACE_TILT_NAME, "")
mean_photovoltaic_power = dictionary.get(MEAN_PHOTOVOLTAIC_POWER_NAME, None)
# Build output string
data = ""
data += "Lat " + str(round_float_values(latitude, rounding_places)) + ", "
data += "Lon " + str(round_float_values(longitude, rounding_places)) + ", "
# data += 'Elevation ' + str(round_float_values(elevation, 0)) + ', '
if elevation is not None:
data += "Elevation " + str(int(elevation)) + ", "
if surface_orientation.optimal:
data += "Optimal Orientation " + str(round_float_values(surface_orientation.value, rounding_places)) + ", "
else:
data += "Orientation " + str(round_float_values(surface_orientation.value, rounding_places)) + ", "
if surface_tilt.optimal:
data += "Optimal Tilt " + str(round_float_values(surface_tilt.value, rounding_places)) + ", "
else:
data += "Tilt " + str(round_float_values(surface_tilt.value, rounding_places)) + ", "
data += "Start " + str(timestamps[0].strftime("%Y-%m-%d %H:%M")) + ", "
data += "End " + str(timestamps[-1].strftime("%Y-%m-%d %H:%M")) + ", "
data += (
"Mean Photovoltaic Power "
+ str(round_float_values(mean_photovoltaic_power, 1))
+ f" {POWER_UNIT}, "
)
fingerprint = dictionary.get(FINGERPRINT_COLUMN_NAME, "")
data += f"Fingerprint {fingerprint}, "
from pvgisprototype._version import __version__
data += f"PVGIS v6 ({__version__})"
qr = qrcode.QRCode(
version=1,
error_correction=qrcode.constants.ERROR_CORRECT_L,
box_size=10,
border=4,
)
qr.add_data(data)
qr.make(fit=True)
if output_type == QuickResponseCode.Image:
return qr # needs to .make_image()
if output_type == QuickResponseCode.Base64:
import base64
from io import BytesIO
image = qr.make_image()
buffer = BytesIO()
image.save(buffer, format="PNG")
image_bytes = buffer.getvalue()
image_base64 = base64.b64encode(image_bytes).decode("utf-8")
return image_base64
return None
series ¶
Modules:
| Name | Description |
|---|---|
csv | Multi-threaded CSV writer, much faster than :meth: |
direct_horizontal_irradiance | |
direct_horizontal_irradiance_ | |
global_horizontal_irradiance | |
global_horizontal_irradiance_ | |
hardcodings | Hardcodings |
horizon_profile | |
horizontal_irradiance | |
open | |
plot | |
relative_humidity | |
select | |
spectral_factor | |
temperature | |
time_series | |
utilities | |
wind_speed | |
csv ¶
Multi-threaded CSV writer, much faster than :meth:pandas.DataFrame.to_csv, with full support for dask <http://dask.org/>_ and dask distributed <http://distributed.dask.org/>_.
Functions:
| Name | Description |
|---|---|
to_csv | Print DataArray to CSV. |
to_csv ¶
Print DataArray to CSV.
When x has a numpy backend, this function is functionally equivalent to (but much faster than)::
x.to_pandas().to_csv(path_or_buf, **kwargs)
When x has dask backend, this function returns a dask delayed object which will write to the disk only when its .compute() method is invoked.
Formatting and optional compression are parallelised across all available CPUs, using one dask task per chunk on the first dimension. Chunks on other dimensions will be merged ahead of computation.
:param x: :class:~xarray.DataArray with one or two dimensions
:param str path: Output file path
:param bool nogil: If True, use the accelerated C implementation. Several kwargs won't be processed correctly (see limitations below). If False, use the pandas to_csv method (slow, and does not release the GIL). nogil=True exclusively supports float and integer value dtypes (but the coords can be anything); in case of an incompatible dtype, nogil is automatically switched to False.
:param kwargs: Passed verbatim to :meth:pandas.DataFrame.to_csv or :meth:pandas.Series.to_csv
Limitations
- Fancy URIs are not (yet) supported.
- compression='zip' is not supported. All other compression methods (gzip, bz2, xz) are supported.
- When running with nogil=True, the following parameters are ignored: columns, quoting, quotechar, doublequote, escapechar, chunksize, decimal
Distributed computing
This function supports dask distributed_, with the caveat that all workers must write to the same shared mountpoint and that the shared filesystem must strictly guarantee close-open coherency, meaning that one must be able to call write() and then close() on a file descriptor from one host and then immediately afterwards open() from another host and see the output from the first host. Note that, for performance reasons, most network filesystems do not enable this feature by default.
Alternatively, one may write to local mountpoints and then manually collect and concatenate the partial outputs.
Source code in pvgisprototype/api/series/csv.py
def to_csv(x: xarray.DataArray, path: str | Path, *, nogil: bool = True, **kwargs):
"""Print DataArray to CSV.
When x has a numpy backend, this function is functionally equivalent to (but
much faster than)::
x.to_pandas().to_csv(path_or_buf, **kwargs)
When x has dask backend, this function returns a dask delayed object which
will write to the disk only when its .compute() method is invoked.
Formatting and optional compression are parallelised across all available
CPUs, using one dask task per chunk on the first dimension. Chunks on other
dimensions will be merged ahead of computation.
:param x:
:class:`~xarray.DataArray` with one or two dimensions
:param str path:
Output file path
:param bool nogil:
If True, use accelerated C implementation. Several kwargs won't be
processed correctly (see limitations below). If False, use pandas
to_csv method (slow, and does not release the GIL).
nogil=True exclusively supports float and integer values dtypes (but
the coords can be anything). In case of incompatible dtype, nogil
is automatically switched to False.
:param kwargs:
Passed verbatim to :meth:`pandas.DataFrame.to_csv` or
:meth:`pandas.Series.to_csv`
**Limitations**
- Fancy URIs are not (yet) supported.
- compression='zip' is not supported. All other compression methods (gzip,
bz2, xz) are supported.
- When running with nogil=True, the following parameters are ignored:
columns, quoting, quotechar, doublequote, escapechar, chunksize, decimal
**Distributed computing**
This function supports `dask distributed`_, with the caveat that all workers
must write to the same shared mountpoint and that the shared filesystem
must strictly guarantee **close-open coherency**, meaning that one must be
able to call write() and then close() on a file descriptor from one host
and then immediately afterwards open() from another host and see the output
from the first host. Note that, for performance reasons, most network
filesystems do not enable this feature by default.
Alternatively, one may write to local mountpoints and then manually collect
and concatenate the partial outputs.
"""
if not isinstance(x, xarray.DataArray):
raise ValueError("first argument must be a DataArray")
# Health checks
if not isinstance(path, Path):
try:
path = Path(path)
except Exception as e:
raise ValueError(f"{e} : `path_or_buf` must be a file path")
if x.ndim not in (1, 2):
raise ValueError(
"cannot convert arrays with %d dimensions into " "pandas objects" % x.ndim
)
if nogil and x.dtype.kind not in "if":
nogil = False
# Extract row and columns indices
indices = [x.get_index(dim) for dim in x.dims]
if x.ndim == 2:
index, columns = indices
else:
index = indices[0]
columns = None
compression = kwargs.pop("compression", "infer")
compress = _compress_func(path, compression)
mode = kwargs.pop("mode", "w")
if mode not in "wa":
raise ValueError('mode: expected w or a; got "%s"' % mode)
# Fast exit for numpy backend
if not x.chunks:
bdata = kernels.to_csv(x.values, index, columns, True, nogil, kwargs)
if compress:
bdata = compress(bdata)
with open(path, mode + "b") as fh:
fh.write(bdata)
return None
# Merge chunks on all dimensions beyond the first
x = x.chunk((x.chunks[0],) + tuple((s,) for s in x.shape[1:]))
# Manually define the dask graph
tok = tokenize(x.data, index, columns, compression, path, kwargs)
name1 = "to_csv_encode-" + tok
name2 = "to_csv_compress-" + tok
name3 = "to_csv_write-" + tok
name4 = "to_csv-" + tok
dsk: dict[str | tuple, tuple] = {}
assert x.chunks
assert x.chunks[0]
offset = 0
for i, size in enumerate(x.chunks[0]):
# Slice index
index_i = index[offset : offset + size]
offset += size
x_i = (x.data.name, i) + (0,) * (x.ndim - 1)
# Step 1: convert to CSV and encode to binary blob
if i == 0:
# First chunk: print header
dsk[name1, i] = (kernels.to_csv, x_i, index_i, columns, True, nogil, kwargs)
else:
kwargs_i = kwargs.copy()
kwargs_i["header"] = False
dsk[name1, i] = (kernels.to_csv, x_i, index_i, None, False, nogil, kwargs_i)
# Step 2 (optional): compress
if compress:
prevname = name2
dsk[name2, i] = compress, (name1, i)
else:
prevname = name1
# Step 3: write to file
if i == 0:
# First chunk: overwrite file if it already exists
dsk[name3, i] = kernels.to_file, path, mode + "b", (prevname, i)
else:
# Next chunks: wait for previous chunk to complete and append
dsk[name3, i] = (kernels.to_file, path, "ab", (prevname, i), (name3, i - 1))
# Rename final key
dsk[name4] = dsk.pop((name3, i))
hlg = HighLevelGraph.from_collections(name4, dsk, (x,))
return Delayed(name4, hlg)
direct_horizontal_irradiance ¶
Functions:
| Name | Description |
|---|---|
get_direct_horizontal_irradiance_series | |
get_direct_horizontal_irradiance_series_from_array_or_set | Extract direct horizontal irradiance time series from xarray DataArray or Dataset. |
get_direct_horizontal_irradiance_series ¶
get_direct_horizontal_irradiance_series(
longitude: float,
latitude: float,
timestamps: DatetimeIndex = str(now()),
direct_horizontal_irradiance_series: Path | None = None,
neighbor_lookup: (
MethodForInexactMatches | None
) = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Source code in pvgisprototype/api/series/direct_horizontal_irradiance.py
def get_direct_horizontal_irradiance_series(
longitude: float,
latitude: float,
timestamps: DatetimeIndex = str(Timestamp.now()),
direct_horizontal_irradiance_series: Path | None = None,
neighbor_lookup: MethodForInexactMatches | None = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
""" """
if isinstance(direct_horizontal_irradiance_series, Path):
from pvgisprototype.api.series.select import select_time_series
from pvgisprototype.api.utilities.conversions import (
convert_float_to_degrees_if_requested,
)
from pvgisprototype.constants import DEGREES
direct_horizontal_irradiance_series = (
select_time_series(
time_series=direct_horizontal_irradiance_series,
longitude=convert_float_to_degrees_if_requested(longitude, DEGREES),
latitude=convert_float_to_degrees_if_requested(latitude, DEGREES),
timestamps=timestamps,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=0, # no verbosity here by choice!
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
if direct_horizontal_irradiance_series.size == 1 and direct_horizontal_irradiance_series.shape == (): # type: ignore[union-attr]
direct_horizontal_irradiance_series = array( # type: ignore[assignment]
[direct_horizontal_irradiance_series], dtype=dtype
)
return direct_horizontal_irradiance_series
get_direct_horizontal_irradiance_series_from_array_or_set ¶
get_direct_horizontal_irradiance_series_from_array_or_set(
longitude: float,
latitude: float,
direct_horizontal_irradiance_series: (
DataArray | Dataset
),
timestamps: DatetimeIndex = str(now()),
neighbor_lookup: (
MethodForInexactMatches | None
) = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Extract direct horizontal irradiance time series from xarray DataArray or Dataset.
Selects and extracts direct horizontal irradiance data for a specific geographic location and time period from an xarray DataArray or Dataset. Performs spatial interpolation using the specified neighbor lookup method and temporal selection based on the provided timestamps.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
longitude | float | Longitude coordinate for data extraction (in degrees or radians). Will be converted to degrees internally if needed. | required |
latitude | float | Latitude coordinate for data extraction (in degrees or radians). Will be converted to degrees internally if needed. | required |
direct_horizontal_irradiance_series | DataArray | Dataset | Input xarray DataArray or Dataset containing direct horizontal irradiance data with spatial (longitude, latitude) and temporal dimensions. | required |
timestamps | DatetimeIndex | Time index for temporal selection of the data, by default str(Timestamp.now()) | str(now()) |
neighbor_lookup | MethodForInexactMatches | None | Method for spatial interpolation when exact coordinate matches are not found, by default NEIGHBOR_LOOKUP_DEFAULT | NEIGHBOR_LOOKUP_DEFAULT |
tolerance | float | None | Maximum distance tolerance for spatial interpolation, by default TOLERANCE_DEFAULT | TOLERANCE_DEFAULT |
dtype | str | Data type for the output numpy array, by default DATA_TYPE_DEFAULT | DATA_TYPE_DEFAULT |
log | int | Logging level for debug output, by default LOG_LEVEL_DEFAULT | LOG_LEVEL_DEFAULT |
Returns:
| Type | Description |
|---|---|
ndarray | Direct horizontal irradiance time series as a 1D numpy array with the specified dtype. Single scalar values are automatically converted to 1D arrays with one element. |
Raises:
| Type | Description |
|---|---|
TypeError | If direct_horizontal_irradiance_series is not a DataArray or Dataset. |
Notes
The function automatically handles coordinate conversion to ensure compatibility with the underlying data. Scalar results are converted to 1D arrays for consistency in downstream processing.
Source code in pvgisprototype/api/series/direct_horizontal_irradiance.py
def get_direct_horizontal_irradiance_series_from_array_or_set(
longitude: float,
latitude: float,
direct_horizontal_irradiance_series: DataArray | Dataset,
timestamps: DatetimeIndex = str(Timestamp.now()),
neighbor_lookup: MethodForInexactMatches | None = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
"""Extract direct horizontal irradiance time series from xarray DataArray or Dataset.
Selects and extracts direct horizontal irradiance data for a specific geographic
location and time period from an xarray DataArray or Dataset. Performs spatial
interpolation using the specified neighbor lookup method and temporal selection
based on the provided timestamps.
Parameters
----------
longitude : float
Longitude coordinate for data extraction (in degrees or radians).
Will be converted to degrees internally if needed.
latitude : float
Latitude coordinate for data extraction (in degrees or radians).
Will be converted to degrees internally if needed.
direct_horizontal_irradiance_series : DataArray | Dataset
Input xarray DataArray or Dataset containing direct horizontal irradiance data
with spatial (longitude, latitude) and temporal dimensions.
timestamps : DatetimeIndex, optional
Time index for temporal selection of the data,
by default str(Timestamp.now())
neighbor_lookup : MethodForInexactMatches | None, optional
Method for spatial interpolation when exact coordinate matches are not found,
by default NEIGHBOR_LOOKUP_DEFAULT
tolerance : float | None, optional
Maximum distance tolerance for spatial interpolation,
by default TOLERANCE_DEFAULT
dtype : str, optional
Data type for the output numpy array,
by default DATA_TYPE_DEFAULT
log : int, optional
Logging level for debug output,
by default LOG_LEVEL_DEFAULT
Returns
-------
numpy.ndarray
Direct horizontal irradiance time series as a 1D numpy array with the
specified dtype. Single scalar values are automatically converted to
1D arrays with one element.
Raises
------
TypeError
If direct_horizontal_irradiance_series is not a DataArray or Dataset.
Notes
-----
The function automatically handles coordinate conversion to ensure compatibility
with the underlying data. Scalar results are converted to 1D arrays for
consistency in downstream processing.
"""
if isinstance(direct_horizontal_irradiance_series, DataArray | Dataset):
from pvgisprototype.api.utilities.conversions import (
convert_float_to_degrees_if_requested,
)
from pvgisprototype.constants import DEGREES
direct_horizontal_irradiance_time_series = (
select_time_series_from_array_or_set(
data=direct_horizontal_irradiance_series,
longitude=convert_float_to_degrees_if_requested(longitude, DEGREES),
latitude=convert_float_to_degrees_if_requested(latitude, DEGREES),
timestamps=timestamps,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
verbose=0, # no verbosity here by choice!
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
if (
direct_horizontal_irradiance_time_series.size == 1
and direct_horizontal_irradiance_time_series.shape == ()
):
direct_horizontal_irradiance_time_series = array(
[direct_horizontal_irradiance_time_series], dtype=dtype
)
else:
raise TypeError(
"Direct horizontal irradiance series must be a DataArray or Dataset."
)
return direct_horizontal_irradiance_time_series
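The scalar-to-1D normalization at the end of this function (and of the sibling selectors below) is a plain numpy pattern: selecting a single grid cell at a single timestamp yields a 0-d array, which is wrapped into a 1-element 1-D array for consistent downstream processing. A minimal standalone sketch, with no pvgisprototype imports (`ensure_1d` is a hypothetical helper name):

```python
import numpy as np

def ensure_1d(values, dtype="float64"):
    """Normalize a 0-d numpy scalar result to a 1-element 1-D array."""
    values = np.asarray(values, dtype=dtype)
    # A single selected cell comes back as a 0-d array (shape == ()).
    if values.size == 1 and values.shape == ():
        values = np.array([values], dtype=dtype)
    return values

print(ensure_1d(3.5).shape)          # (1,)
print(ensure_1d([1.0, 2.0]).shape)   # (2,)
```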
direct_horizontal_irradiance_ ¶
Functions:
| Name | Description |
|---|---|
get_direct_horizontal_irradiance_series_from_array_or_set | |
get_direct_horizontal_irradiance_series_from_array_or_set ¶
get_direct_horizontal_irradiance_series_from_array_or_set(
longitude: float,
latitude: float,
timestamps: DatetimeIndex = str(now()),
direct_horizontal_irradiance_series: Path | None = None,
neighbor_lookup: (
MethodForInexactMatches | None
) = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Source code in pvgisprototype/api/series/direct_horizontal_irradiance_.py
def get_direct_horizontal_irradiance_series_from_array_or_set(
longitude: float,
latitude: float,
timestamps: DatetimeIndex = str(Timestamp.now()),
direct_horizontal_irradiance_series: Path | None = None,
neighbor_lookup: MethodForInexactMatches | None = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
""" """
if isinstance(direct_horizontal_irradiance_series, Path):
from pvgisprototype.api.series.select import select_time_series
from pvgisprototype.constants import DEGREES
from pvgisprototype.api.utilities.conversions import (
convert_float_to_degrees_if_requested,
)
direct_horizontal_irradiance_series = (
select_time_series(
time_series=direct_horizontal_irradiance_series,
longitude=convert_float_to_degrees_if_requested(longitude, DEGREES),
latitude=convert_float_to_degrees_if_requested(latitude, DEGREES),
timestamps=timestamps,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=0, # no verbosity here by choice!
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
if direct_horizontal_irradiance_series.size == 1 and direct_horizontal_irradiance_series.shape == (): # type: ignore[union-attr]
direct_horizontal_irradiance_series = array( # type: ignore[assignment]
[direct_horizontal_irradiance_series], dtype=dtype
)
return direct_horizontal_irradiance_series
global_horizontal_irradiance ¶
Functions:
| Name | Description |
|---|---|
get_global_horizontal_irradiance_series | |
get_global_horizontal_irradiance_series_from_array_or_set | Extract global horizontal irradiance time series from xarray DataArray or Dataset. |
get_global_horizontal_irradiance_series ¶
get_global_horizontal_irradiance_series(
longitude: float,
latitude: float,
timestamps: DatetimeIndex = str(now()),
global_horizontal_irradiance_series: Path | None = None,
neighbor_lookup: (
MethodForInexactMatches | None
) = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Source code in pvgisprototype/api/series/global_horizontal_irradiance.py
def get_global_horizontal_irradiance_series(
longitude: float,
latitude: float,
timestamps: DatetimeIndex = str(Timestamp.now()),
global_horizontal_irradiance_series: Path | None = None,
neighbor_lookup: MethodForInexactMatches | None = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
""" """
if isinstance(global_horizontal_irradiance_series, Path):
from pvgisprototype.api.series.select import select_time_series
from pvgisprototype.api.utilities.conversions import (
convert_float_to_degrees_if_requested,
)
from pvgisprototype.constants import DEGREES
global_horizontal_irradiance_series = (
select_time_series(
time_series=global_horizontal_irradiance_series,
longitude=convert_float_to_degrees_if_requested(longitude, DEGREES),
latitude=convert_float_to_degrees_if_requested(latitude, DEGREES),
timestamps=timestamps,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=0, # no verbosity here by choice!
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
if global_horizontal_irradiance_series.size == 1 and global_horizontal_irradiance_series.shape == (): # type: ignore[union-attr]
global_horizontal_irradiance_series = array( # type: ignore[assignment]
[global_horizontal_irradiance_series], dtype=dtype
)
return global_horizontal_irradiance_series
get_global_horizontal_irradiance_series_from_array_or_set ¶
get_global_horizontal_irradiance_series_from_array_or_set(
longitude: float,
latitude: float,
global_horizontal_irradiance_series: (
DataArray | Dataset
),
timestamps: DatetimeIndex = str(now()),
neighbor_lookup: (
MethodForInexactMatches | None
) = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Extract global horizontal irradiance time series from xarray DataArray or Dataset.
Selects and extracts global horizontal irradiance data for a specific geographic location and time period from an xarray DataArray or Dataset. Performs spatial interpolation using the specified neighbor lookup method and temporal selection based on the provided timestamps.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
longitude | float | Longitude coordinate for data extraction (in degrees or radians). Will be converted to degrees internally if needed. | required |
latitude | float | Latitude coordinate for data extraction (in degrees or radians). Will be converted to degrees internally if needed. | required |
global_horizontal_irradiance_series | DataArray | Dataset | Input xarray DataArray or Dataset containing global horizontal irradiance data with spatial (longitude, latitude) and temporal dimensions. | required |
timestamps | DatetimeIndex | Time index for temporal selection of the data, by default str(Timestamp.now()) | str(now()) |
neighbor_lookup | MethodForInexactMatches | None | Method for spatial interpolation when exact coordinate matches are not found, by default NEIGHBOR_LOOKUP_DEFAULT | NEIGHBOR_LOOKUP_DEFAULT |
tolerance | float | None | Maximum distance tolerance for spatial interpolation, by default TOLERANCE_DEFAULT | TOLERANCE_DEFAULT |
dtype | str | Data type for the output numpy array, by default DATA_TYPE_DEFAULT | DATA_TYPE_DEFAULT |
log | int | Logging level for debug output, by default LOG_LEVEL_DEFAULT | LOG_LEVEL_DEFAULT |
Returns:
| Type | Description |
|---|---|
ndarray | Global horizontal irradiance time series as a 1D numpy array with the specified dtype. Single scalar values are automatically converted to 1D arrays with one element. |
Raises:
| Type | Description |
|---|---|
TypeError | If global_horizontal_irradiance_series is not a DataArray or Dataset. |
Notes
The function automatically handles coordinate conversion to ensure compatibility with the underlying data. Scalar results are converted to 1D arrays for consistency in downstream processing.
Source code in pvgisprototype/api/series/global_horizontal_irradiance.py
def get_global_horizontal_irradiance_series_from_array_or_set(
longitude: float,
latitude: float,
global_horizontal_irradiance_series: DataArray | Dataset,
timestamps: DatetimeIndex = str(Timestamp.now()),
neighbor_lookup: MethodForInexactMatches | None = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
"""Extract global horizontal irradiance time series from xarray DataArray or Dataset.
Selects and extracts global horizontal irradiance data for a specific geographic
location and time period from an xarray DataArray or Dataset. Performs spatial
interpolation using the specified neighbor lookup method and temporal selection
based on the provided timestamps.
Parameters
----------
longitude : float
Longitude coordinate for data extraction (in degrees or radians).
Will be converted to degrees internally if needed.
latitude : float
Latitude coordinate for data extraction (in degrees or radians).
Will be converted to degrees internally if needed.
global_horizontal_irradiance_series : DataArray | Dataset
Input xarray DataArray or Dataset containing global horizontal irradiance data
with spatial (longitude, latitude) and temporal dimensions.
timestamps : DatetimeIndex, optional
Time index for temporal selection of the data,
by default str(Timestamp.now())
neighbor_lookup : MethodForInexactMatches | None, optional
Method for spatial interpolation when exact coordinate matches are not found,
by default NEIGHBOR_LOOKUP_DEFAULT
tolerance : float | None, optional
Maximum distance tolerance for spatial interpolation,
by default TOLERANCE_DEFAULT
dtype : str, optional
Data type for the output numpy array,
by default DATA_TYPE_DEFAULT
log : int, optional
Logging level for debug output,
by default LOG_LEVEL_DEFAULT
Returns
-------
numpy.ndarray
Global horizontal irradiance time series as a 1D numpy array with the
specified dtype. Single scalar values are automatically converted to
1D arrays with one element.
Raises
------
TypeError
If global_horizontal_irradiance_series is not a DataArray or Dataset.
Notes
-----
The function automatically handles coordinate conversion to ensure compatibility
with the underlying data. Scalar results are converted to 1D arrays for
consistency in downstream processing.
"""
if isinstance(global_horizontal_irradiance_series, DataArray | Dataset):
from pvgisprototype.api.utilities.conversions import (
convert_float_to_degrees_if_requested,
)
from pvgisprototype.constants import DEGREES
global_horizontal_irradiance_time_series = (
select_time_series_from_array_or_set(
data=global_horizontal_irradiance_series,
longitude=convert_float_to_degrees_if_requested(longitude, DEGREES),
latitude=convert_float_to_degrees_if_requested(latitude, DEGREES),
timestamps=timestamps,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
verbose=0, # no verbosity here by choice!
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
if (
global_horizontal_irradiance_time_series.size == 1
and global_horizontal_irradiance_time_series.shape == ()
):
global_horizontal_irradiance_time_series = array(
[global_horizontal_irradiance_time_series], dtype=dtype
)
else:
raise TypeError(
"Global horizontal irradiance series must be a DataArray or Dataset."
)
return global_horizontal_irradiance_time_series
global_horizontal_irradiance_ ¶
Functions:
| Name | Description |
|---|---|
get_global_horizontal_irradiance_series_from_array_or_set | |
get_global_horizontal_irradiance_series_from_array_or_set ¶
get_global_horizontal_irradiance_series_from_array_or_set(
longitude: float,
latitude: float,
timestamps: DatetimeIndex = str(now()),
global_horizontal_irradiance_series: Path | None = None,
neighbor_lookup: (
MethodForInexactMatches | None
) = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Source code in pvgisprototype/api/series/global_horizontal_irradiance_.py
def get_global_horizontal_irradiance_series_from_array_or_set(
longitude: float,
latitude: float,
timestamps: DatetimeIndex = str(Timestamp.now()),
global_horizontal_irradiance_series: Path | None = None,
neighbor_lookup: MethodForInexactMatches | None = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
""" """
if isinstance(global_horizontal_irradiance_series, Path):
from pvgisprototype.api.series.select import select_time_series
from pvgisprototype.constants import DEGREES
from pvgisprototype.api.utilities.conversions import (
convert_float_to_degrees_if_requested,
)
global_horizontal_irradiance_series = (
select_time_series(
time_series=global_horizontal_irradiance_series,
longitude=convert_float_to_degrees_if_requested(longitude, DEGREES),
latitude=convert_float_to_degrees_if_requested(latitude, DEGREES),
timestamps=timestamps,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=0, # no verbosity here by choice!
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
if global_horizontal_irradiance_series.size == 1 and global_horizontal_irradiance_series.shape == (): # type: ignore[union-attr]
global_horizontal_irradiance_series = array( # type: ignore[assignment]
[global_horizontal_irradiance_series], dtype=dtype
)
return global_horizontal_irradiance_series
hardcodings ¶
Hardcodings
horizon_profile ¶
Functions:
| Name | Description |
|---|---|
get_horizon_profile_from_array_or_set | Extract horizon profile values from xarray DataArray or Dataset. |
get_horizon_profile_from_array_or_set ¶
get_horizon_profile_from_array_or_set(
longitude: float,
latitude: float,
horizon_profile: DataArray | Dataset,
neighbor_lookup: (
MethodForInexactMatches | None
) = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Extract horizon profile values from xarray DataArray or Dataset.
Selects and extracts horizon profile data for a specific geographic location from an xarray DataArray or Dataset. Performs spatial interpolation using the specified neighbor lookup method.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
longitude | float | Longitude coordinate for data extraction (in degrees or radians). Will be converted to degrees internally if needed. | required |
latitude | float | Latitude coordinate for data extraction (in degrees or radians). Will be converted to degrees internally if needed. | required |
horizon_profile | DataArray | Dataset | Input xarray DataArray or Dataset containing horizon profile data with spatial (longitude, latitude) and azimuth dimensions. | required |
neighbor_lookup | MethodForInexactMatches | None | Method for spatial interpolation when exact coordinate matches are not found, by default NEIGHBOR_LOOKUP_DEFAULT | NEIGHBOR_LOOKUP_DEFAULT |
tolerance | float | None | Maximum distance tolerance for spatial interpolation, by default TOLERANCE_DEFAULT | TOLERANCE_DEFAULT |
dtype | str | Data type for the output numpy array, by default DATA_TYPE_DEFAULT | DATA_TYPE_DEFAULT |
log | int | Logging level for debug output, by default LOG_LEVEL_DEFAULT | LOG_LEVEL_DEFAULT |
Returns:
| Type | Description |
|---|---|
ndarray | Horizon profile values as a numpy array with the specified dtype. |
Raises:
| Type | Description |
|---|---|
TypeError | If horizon_profile is not a DataArray or Dataset. |
Notes
The function automatically handles coordinate conversion to ensure compatibility with the underlying data.
Source code in pvgisprototype/api/series/horizon_profile.py
def get_horizon_profile_from_array_or_set(
longitude: float,
latitude: float,
horizon_profile: DataArray | Dataset,
neighbor_lookup: MethodForInexactMatches | None = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
"""Extract horizon profile values from xarray DataArray or Dataset.
Selects and extracts horizon profile data for a specific geographic
location from an xarray DataArray or Dataset. Performs spatial
interpolation using the specified neighbor lookup method.
Parameters
----------
longitude : float
Longitude coordinate for data extraction (in degrees or radians).
Will be converted to degrees internally if needed.
latitude : float
Latitude coordinate for data extraction (in degrees or radians).
Will be converted to degrees internally if needed.
horizon_profile : DataArray | Dataset
Input xarray DataArray or Dataset containing horizon profile data
with spatial (longitude, latitude) and azimuth dimensions.
neighbor_lookup : MethodForInexactMatches | None, optional
Method for spatial interpolation when exact coordinate matches are not found,
by default NEIGHBOR_LOOKUP_DEFAULT
tolerance : float | None, optional
Maximum distance tolerance for spatial interpolation,
by default TOLERANCE_DEFAULT
dtype : str, optional
Data type for the output numpy array,
by default DATA_TYPE_DEFAULT
log : int, optional
Logging level for debug output,
by default LOG_LEVEL_DEFAULT
Returns
-------
numpy.ndarray
Horizon profile values as a numpy array with the specified dtype.
Raises
------
TypeError
If horizon_profile is not a DataArray or Dataset.
Notes
-----
The function automatically handles coordinate conversion to ensure compatibility
with the underlying data.
"""
if isinstance(horizon_profile, DataArray | Dataset):
from pvgisprototype.api.utilities.conversions import (
convert_float_to_degrees_if_requested,
)
from pvgisprototype.constants import DEGREES
horizon_profile_series = select_time_series_from_array_or_set(
data=horizon_profile,
longitude=convert_float_to_degrees_if_requested(longitude, DEGREES),
latitude=convert_float_to_degrees_if_requested(latitude, DEGREES),
timestamps=None,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
verbose=0, # no verbosity here by choice!
log=log,
)
return horizon_profile_series
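The `neighbor_lookup` / `tolerance` pair used throughout these selectors mirrors xarray's `sel(method="nearest", tolerance=...)` semantics: pick the closest grid coordinate, but reject it if it lies farther away than the tolerance. A numpy-only sketch of the idea (`nearest_index` is a hypothetical helper, not the library's implementation):

```python
import numpy as np

def nearest_index(coordinates, target, tolerance=None):
    """Index of the coordinate closest to `target`, or None when the
    nearest value is farther away than `tolerance`."""
    coordinates = np.asarray(coordinates, dtype=float)
    index = int(np.abs(coordinates - target).argmin())
    if tolerance is not None and abs(coordinates[index] - target) > tolerance:
        return None  # no match within the requested tolerance
    return index

grid = np.arange(-10.0, 10.5, 0.5)          # a 0.5-degree longitude grid
print(nearest_index(grid, 3.26))             # 27, i.e. grid value 3.5
print(nearest_index(grid, 3.26, tolerance=0.1))  # None
```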
horizontal_irradiance ¶
Functions:
| Name | Description |
|---|---|
read_horizontal_irradiance_components_from_sarah | Read horizontal irradiance components from SARAH time series. |
read_horizontal_irradiance_components_from_sarah ¶
read_horizontal_irradiance_components_from_sarah(
shortwave: Path | None,
direct: Path | None,
longitude: float,
latitude: float,
timestamps: DatetimeIndex | None = DatetimeIndex(
[now(tz="UTC")]
),
neighbor_lookup: MethodForInexactMatches = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = False,
in_memory: bool = False,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
) -> tuple[ndarray, ndarray]
Read horizontal irradiance components from SARAH time series.
Read the global and direct horizontal irradiance components incident on a solar surface from SARAH time series.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
shortwave | Path | None | Filename of surface short-wave (solar) radiation downwards time series (short name: `ssrd`) from ECMWF, which is the solar radiation that reaches a horizontal plane at the surface of the Earth. This parameter comprises both direct and diffuse solar radiation. | required |
Returns:
| Type | Description |
|---|---|
tuple[ndarray, ndarray] | The global and the direct horizontal irradiance series, each as a numpy array. |
Source code in pvgisprototype/api/series/horizontal_irradiance.py
@log_function_call
def read_horizontal_irradiance_components_from_sarah(
shortwave: Path | None,
direct: Path | None,
longitude: float,
latitude: float,
timestamps: DatetimeIndex | None = DatetimeIndex([Timestamp.now(tz='UTC')]),
neighbor_lookup: MethodForInexactMatches = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = False,
in_memory: bool = False,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
) -> tuple[numpy.ndarray, numpy.ndarray]:
"""Read horizontal irradiance components from SARAH time series.
Read the global and direct horizontal irradiance components incident on a
solar surface from SARAH time series.
Parameters
----------
shortwave: Path
Filename of surface short-wave (solar) radiation downwards time series
(short name : `ssrd`) from ECMWF which is the solar radiation that
reaches a horizontal plane at the surface of the Earth. This parameter
comprises both direct and diffuse solar radiation.
Returns
-------
tuple(GlobalHorizontalIrradiance, DirectHorizontalIrradiance)
"""
if verbose > 0:
logger.info(
":information: Reading the global and direct horizontal irradiance components from external data ...",
alt=f":information: [black on white][bold]Reading[/bold] the [orange]global[/orange] and [yellow]direct[/yellow] horizontal irradiance components [bold]from external data[/bold] ...[/black on white]",
)
if multi_thread:
from concurrent.futures import ThreadPoolExecutor
with ThreadPoolExecutor(max_workers=2) as executor:
future_global_horizontal_irradiance_series = executor.submit(
select_time_series,
time_series=shortwave,
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=0,
log=log,
)
future_direct_horizontal_irradiance_series = executor.submit(
select_time_series,
time_series=direct,
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=0,
log=log,
)
global_horizontal_irradiance_series = (
future_global_horizontal_irradiance_series.result()
.to_numpy()
.astype(dtype=dtype)
)
direct_horizontal_irradiance_series = (
future_direct_horizontal_irradiance_series.result()
.to_numpy()
.astype(dtype=dtype)
)
else:
global_horizontal_irradiance_series = (
select_time_series(
time_series=shortwave,
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=verbose,
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
direct_horizontal_irradiance_series = (
select_time_series(
time_series=direct,
longitude=longitude,
latitude=latitude,
timestamps=timestamps,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=verbose,
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
return global_horizontal_irradiance_series, direct_horizontal_irradiance_series
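The `multi_thread` branch above submits both reads to a `ThreadPoolExecutor` before waiting on either result, so the two I/O-bound selections overlap instead of running back to back. The pattern, reduced to the standard library only (`slow_read` is a stand-in for `select_time_series`):

```python
from concurrent.futures import ThreadPoolExecutor
import time

def slow_read(name, delay=0.05):
    """Stand-in for an I/O-bound read such as select_time_series()."""
    time.sleep(delay)
    return f"{name}-data"

with ThreadPoolExecutor(max_workers=2) as executor:
    # Submit both reads first, then block on the results: the two
    # delays overlap rather than accumulate.
    future_global = executor.submit(slow_read, "shortwave")
    future_direct = executor.submit(slow_read, "direct")
    global_series = future_global.result()
    direct_series = future_direct.result()

print(global_series, direct_series)  # shortwave-data direct-data
```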
open ¶
Functions:
| Name | Description |
|---|---|
filter_xarray | Filter a Dataset or DataArray based on a given coordinate with specified minimum and/or maximum values. |
get_scale_and_offset | Get scale and offset values from a netCDF file using xarray |
load_or_open_dataarray_from_dataset | Load or open a variable from a dataset and select coordinates. |
open_data_array | |
open_data_set | Open or load a dataset based on the input flags. |
open_xarray_supported_time_series_data | Select location series |
read_data_array_or_set | Open the data and determine if it's a DataArray or Dataset. |
select_coordinates | Select single pair of coordinates from a data array |
select_location_time_series | Select a location from a time series data format supported by |
select_location_time_series_from_array_or_set | Select location-specific time series data from xarray Dataset or DataArray. |
set_location_indexers | Select single pair of coordinates from a data array |
filter_xarray ¶
filter_xarray(
data: Dataset | DataArray,
coordinate: str,
minimum: float | None,
maximum: float | None,
drop: bool = True,
) -> Dataset | DataArray
Filter a Dataset or DataArray based on a given coordinate with specified minimum and/or maximum values. If the minimum or maximum is None, the function will ignore that bound.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
data | Dataset | DataArray | The input xarray Dataset or DataArray to filter. | required |
coordinate | str | The name of the coordinate within the Dataset or DataArray to apply the filter on. | required |
minimum | float or None | The minimum value for the coordinate. If None, no lower bound is applied. | required |
maximum | float or None | The maximum value for the coordinate. If None, no upper bound is applied. | required |
drop | bool | Whether to drop values that fall outside the range, by default True. | True |
Returns:
| Type | Description |
|---|---|
Dataset | DataArray | The filtered xarray Dataset or DataArray, where values outside the [minimum, maximum] range are dropped or masked. |
Raises:
| Type | Description |
|---|---|
ValueError | If the coordinate is not present in the input data. |
Notes
- If both `minimum` and `maximum` are None, the input data is returned unfiltered.
- Emits a warning via logger if any values exceed the bounds.
Source code in pvgisprototype/api/series/open.py
def filter_xarray(
data: Dataset | DataArray,
coordinate: str,
minimum: float | None,
maximum: float | None,
drop: bool = True,
) -> Dataset | DataArray:
"""
Filter a Dataset or DataArray based on a given coordinate with specified minimum and/or
maximum values. If the `minimum` or `maximum` is None, the function will ignore that bound.
Parameters
----------
data : Dataset | DataArray
The input xarray Dataset or DataArray to filter.
coordinate : str
The name of the coordinate within the Dataset or DataArray to apply the filter on.
minimum : float or None
The minimum value for the coordinate. If None, no lower bound is applied.
maximum : float or None
The maximum value for the coordinate. If None, no upper bound is applied.
drop : bool, optional
Whether to drop values that fall outside the range, by default True.
Returns
-------
Dataset | DataArray
The filtered xarray Dataset or DataArray, where values outside the
[minimum, maximum] range are dropped or masked.
Raises
------
ValueError
If the coordinate is not present in the input data.
Notes
-----
- If both `minimum` and `maximum` are None, the input data is returned unfiltered.
- Emits a warning via logger if any values exceed the bounds.
"""
if coordinate not in data.coords:
raise ValueError(f"Coordinate '{coordinate}' not found in the dataset.")
condition = True # Start with an always-true condition
if minimum is not None:
condition &= data[coordinate] >= minimum
if maximum is not None:
condition &= data[coordinate] <= maximum
# values outside the requested range ?
if numpy.any(~condition):
warning_message = f"{x_mark} The input data exceed the reference range [{minimum}, {maximum}]."
warning_alternative = f"{x_mark} [bold]The input data [red]exceed[/red] the reference range[/bold] [{minimum}, {maximum}]."
typer.echo(warning_message)
logger.warning(warning_message, alt=warning_alternative)
else:
success_message = f"{check_mark} The input data are within the reference range [{minimum}, {maximum}]."
typer.echo(success_message)
logger.debug(success_message)
return data.where(condition, drop=drop)
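The incremental `condition &=` construction above generalizes to any number of optional bounds. A numpy-only sketch of the same filtering logic on synthetic data, without xarray (`filter_range` is a hypothetical helper):

```python
import numpy as np

def filter_range(values, minimum=None, maximum=None):
    """Keep values inside [minimum, maximum]; None bounds are ignored."""
    values = np.asarray(values, dtype=float)
    condition = np.ones(values.shape, dtype=bool)  # start always-true
    if minimum is not None:
        condition &= values >= minimum
    if maximum is not None:
        condition &= values <= maximum
    # Analogous to xarray's .where(condition, drop=True).
    return values[condition]

data = np.array([-5.0, 0.0, 3.0, 12.0])
print(filter_range(data, minimum=0.0, maximum=10.0))  # [0. 3.]
```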
get_scale_and_offset ¶
Get scale and offset values from a netCDF file using xarray
Source code in pvgisprototype/api/series/open.py
def get_scale_and_offset(netcdf):
"""Get scale and offset values from a netCDF file using xarray"""
import xarray as xr
# Open the dataset using xarray
dataset = xr.open_dataset(netcdf)
# Get all dimensions
netcdf_dimensions = set(dataset.dims)
# Get all variables
netcdf_variables = set(dataset.data_vars)
# Assuming the first variable that is not a dimension is the target variable
variable = list(netcdf_variables.difference(netcdf_dimensions))[0]
# Get the variable's attributes
variable_attrs = dataset[variable].attrs
# Retrieve scale_factor and add_offset attributes if they exist
scale_factor = variable_attrs.get("scale_factor", None)
add_offset = variable_attrs.get("add_offset", None)
return (scale_factor, add_offset)
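The two attributes this helper returns follow the CF packing convention, `decoded = raw * scale_factor + add_offset`, which is what xarray applies automatically when `mask_and_scale` is enabled. A minimal sketch of decoding packed integers by hand (`decode_packed` is a hypothetical helper):

```python
import numpy as np

def decode_packed(raw, scale_factor=None, add_offset=None):
    """Apply the CF packing convention: decoded = raw * scale + offset."""
    decoded = np.asarray(raw, dtype="float64")
    if scale_factor is not None:
        decoded = decoded * scale_factor
    if add_offset is not None:
        decoded = decoded + add_offset
    return decoded

# e.g. irradiance packed as int16 with scale_factor 0.1 and add_offset 0.0
raw = np.array([0, 1234, 5000], dtype="int16")
print(decode_packed(raw, scale_factor=0.1, add_offset=0.0))
```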
load_or_open_dataarray_from_dataset ¶
load_or_open_dataarray_from_dataset(
dataset: Path,
variable: str | None = None,
longitude: float | None = None,
latitude: float | None = None,
time: str | None = None,
column_numbers: str | None = None,
mask_and_scale: bool = False,
in_memory: bool = False,
method: str = "nearest",
tolerance: float = 0.1,
verbose: int = 0,
)
Load or open a variable from a dataset and select coordinates.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
dataset | Path | | required |
variable | str | None | | None |
longitude | float | None | | None |
latitude | float | None | | None |
time | str | None | | None |
column_numbers | str | None | 1-based column selection, e.g. '3', '1,5,7' or '1-10' | None |
mask_and_scale | bool | | False |
in_memory | bool | | False |
method | str | | 'nearest' |
tolerance | float | | 0.1 |
verbose | int | | 0 |
Source code in pvgisprototype/api/series/open.py
def load_or_open_dataarray_from_dataset(
dataset: Path,
variable: str | None = None,
longitude: float | None = None,
latitude: float | None = None,
time: str | None = None,
column_numbers: str | None = None,
mask_and_scale: bool = False,
in_memory: bool = False,
method: str = "nearest",
tolerance: float = 0.1,
verbose: int = 0,
):
"""
Load or open a variable from a dataset and select coordinates.
Parameters
----------
dataset: Path to the NetCDF dataset file.
variable: The variable name to extract from the dataset.
longitude: Longitude value to select.
latitude: Latitude value to select.
time: Time value to select.
column_numbers: 1-based column selection, e.g. '3', '1,5,7' or '1-10'.
mask_and_scale: Boolean to mask and scale data.
in_memory: Boolean to load dataset into memory.
method: Method for inexact coordinate matches, e.g. 'nearest'.
tolerance: Tolerance level for selecting the nearest coordinate.
verbose: Verbosity level for logging.
"""
# Open the dataset
ds = open_data_set(
input_data=dataset,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=verbose,
)
# If a variable is specified, check and extract it, otherwise raise error
if variable:
if variable in ds.variables:
data_array = ds[variable]
else:
logger.error(f"{x_mark} Variable '{variable}' not found in the dataset!")
raise typer.Exit(code=33)
else:
logger.error(f"{x_mark} No variable specified!")
raise typer.Exit(code=33)
# Select coordinates for longitude, latitude, and time if provided
indexers = {}
if "longitude" in ds.coords and longitude:
indexers["longitude"] = longitude
elif "lon" in ds.coords and longitude:
indexers["lon"] = longitude
if "latitude" in ds.coords and latitude:
indexers["latitude"] = latitude
elif "lat" in ds.coords and latitude:
indexers["lat"] = latitude
if time:
indexers["time"] = time
# Apply selection using nearest method and tolerance if required
try:
data_array = data_array.sel(**indexers, method=method, tolerance=tolerance)
except Exception as e:
logger.error(f"Error in selecting data with given coordinates: {str(e)}")
raise typer.Exit(code=33)
if column_numbers:
try:
if "-" in column_numbers: # Handle range like '1-10'
start, end = map(int, column_numbers.split("-"))
data_array = data_array.isel(
center_wavelength=slice(start - 1, end)
) # Adjust to 0-based indexing
elif "," in column_numbers: # Handle list like '1,5,7'
indices = list(map(int, column_numbers.split(",")))
data_array = data_array.isel(
center_wavelength=[i - 1 for i in indices]
) # Adjust to 0-based indexing
else: # Handle single value like '1'
index = int(column_numbers) - 1 # Adjust to 0-based indexing
data_array = data_array.isel(center_wavelength=index)
except Exception as e:
logger.error(f"Error in processing column_numbers: {str(e)}")
raise typer.Exit(code=33)
if verbose > 0:
logger.debug(f"Data successfully loaded for variable '{variable}'.")
return data_array
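The 1-based `column_numbers` specification maps to 0-based `isel` indices as shown in the source above. The parsing rules in isolation (the helper name is illustrative):

```python
def parse_column_numbers(spec):
    """Convert a 1-based column spec ('1-10', '1,5,7' or '3') to 0-based indices."""
    if "-" in spec:  # range like '1-10'
        start, end = map(int, spec.split("-"))
        return list(range(start - 1, end))
    if "," in spec:  # explicit list like '1,5,7'
        return [int(i) - 1 for i in spec.split(",")]
    return [int(spec) - 1]  # single value like '3'
```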
open_data_array ¶
open_data_array(
input_data: str,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = VERBOSE_LEVEL_DEFAULT,
)
Source code in pvgisprototype/api/series/open.py
def open_data_array(
input_data: str,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = VERBOSE_LEVEL_DEFAULT,
):
""" """
# try:
# if in_memory:
# dataarray = xr.load_dataarray(
# filename_or_object=netcdf,
# mask_and_scale=mask_and_scale,
# )
# return dataarray
# except Exception as exc:
# typer.echo(f"Could not load the data in memory: {str(exc)}")
# try:
# dataarray = xr.open_dataarray(
# filename_or_object=input_data_file,
# mask_and_scale=mask_and_scale,
# )
# return dataarray
# except Exception as exc:
# typer.echo(f"Could not open the data: {str(exc)}")
# raise typer.Exit(code=33)
if in_memory:
if verbose > 0:
logger.debug(f"Loading data array '{input_data}' in memory...")
return load_or_open_dataarray(
function=xr.load_dataarray,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
else:
if verbose > 0:
logger.debug(f"Opening data array '{input_data}'...")
return load_or_open_dataarray(
function=xr.open_dataarray,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
open_data_set ¶
open_data_set(
input_data: Path,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = 0,
)
Open or load a dataset based on the input flags.
Source code in pvgisprototype/api/series/open.py
def open_data_set(
input_data: Path,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = 0,
):
"""Open or load a dataset based on the input flags."""
if in_memory:
if verbose > 0:
logger.debug(f"Loading dataset '{input_data}' in memory...")
return load_or_open_dataset(
function=xr.load_dataset,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
else:
if verbose > 0:
logger.debug(f"Opening dataset '{input_data}'...")
return load_or_open_dataset(
function=xr.open_dataset,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
open_xarray_supported_time_series_data ¶
Select location series
read_data_array_or_set ¶
read_data_array_or_set(
input_data: Path,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = 0,
)
Open the data and determine if it's a DataArray or Dataset.
Source code in pvgisprototype/api/series/open.py
def read_data_array_or_set(
input_data: Path,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = 0,
):
"""Open the data and determine if it's a DataArray or Dataset."""
# try reading an array
try:
if in_memory:
if verbose > 0:
logger.debug(
f" - {exclamation_mark} Trying to load {input_data} into memory as a DataArray...",
alt=f" - {exclamation_mark} [bold]Trying[/bold] to load {input_data} into memory as a DataArray...",
)
return load_or_open_dataarray(
function=xr.load_dataarray,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
else:
if verbose > 0:
logger.debug(
f" - {exclamation_mark} Trying to open {input_data} as a DataArray...",
alt=f" - {exclamation_mark} [bold]Trying[/bold] to open {input_data} as a DataArray...",
)
return load_or_open_dataarray(
function=xr.open_dataarray,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
# or a set
except Exception:
try:
if in_memory:
if verbose > 0:
logger.debug(
f" - {exclamation_mark} Trying to load {input_data} into memory as a Dataset...",
alt=f" - {exclamation_mark} [bold]Trying[/bold] to load {input_data} into memory as a Dataset...",
)
return load_or_open_dataset(
function=xr.load_dataset,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
else:
if verbose > 0:
logger.debug(
f" - {exclamation_mark} Trying to open {input_data} as a Dataset...",
alt=f" - {exclamation_mark} [bold]Trying[/bold] to open {input_data} as a Dataset...",
)
return load_or_open_dataset(
function=xr.open_dataset,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
except Exception as e:
logger.error(
f"Error loading or opening data: {str(e)}",
alt=f"Error loading or opening data: {str(e)}",
)
raise typer.Exit(code=33)
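The try-DataArray-then-Dataset shape above is a generic fallback pattern: attempt readers in order and return the first that succeeds. A self-contained sketch of that pattern (readers and names here are stand-ins, not library code):

```python
def read_with_fallback(source, readers):
    """Try each (name, reader) pair in turn; return the first success."""
    errors = []
    for name, reader in readers:
        try:
            return name, reader(source)
        except Exception as exc:
            errors.append(f"{name}: {exc}")
    raise SystemExit(f"All readers failed: {'; '.join(errors)}")

def as_array(source):
    if isinstance(source, list):
        return source
    raise TypeError("not an array")

def as_set(source):
    return dict(source)

# A dict-like source fails the array reader and falls back to the set reader
kind, data = read_with_fallback({"t2m": [1, 2]}, [("DataArray", as_array), ("Dataset", as_set)])
```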
select_coordinates ¶
select_coordinates(
data_array,
longitude: Longitude,
latitude: Latitude,
time: str = None,
method: str = "nearest",
tolerance: float = 0.1,
verbose: int = VERBOSE_LEVEL_DEFAULT,
)
Select single pair of coordinates from a data array
Will select center coordinates if none of (longitude, latitude) are provided.
Source code in pvgisprototype/api/series/open.py
def select_coordinates(
data_array,
longitude: Longitude,
latitude: Latitude,
time: str = None,
method: str = "nearest",
tolerance: float = 0.1,
verbose: int = VERBOSE_LEVEL_DEFAULT,
):
"""Select single pair of coordinates from a data array
Will select center coordinates if none of (longitude, latitude) are
provided.
"""
indexers = set_location_indexers(
data_array=data_array,
longitude=longitude,
latitude=latitude,
verbose=verbose,
)
try:
if not time:
data_array = data_array.sel(
**indexers,
method=method,
)
else:
# Review-Me ------------------------------------------------------
data_array = data_array.sel(time=time, method=method).sel(
**indexers,
method=method,
tolerance=tolerance,
)
# Review-Me ------------------------------------------------------
except Exception as exception:
print(f"{x_mark} {ERROR_IN_SELECTING_DATA} : {exception}")
raise SystemExit(33)
return data_array
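xarray's `method='nearest'` with `tolerance` behaves like the following sketch over a single coordinate axis; this is a simplified analogue for illustration, not the library implementation:

```python
def nearest_within(values, target, tolerance):
    """Return the value nearest to target, or raise if it lies outside tolerance."""
    candidate = min(values, key=lambda v: abs(v - target))
    if abs(candidate - target) > tolerance:
        raise ValueError(f"No value within {tolerance} of {target}")
    return candidate

longitudes = [5.0, 5.25, 5.5, 5.75]
nearest_within(longitudes, 5.3, tolerance=0.1)  # nearest grid point is 5.25
```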
select_location_time_series ¶
select_location_time_series(
time_series: Path,
variable: str | None = None,
coordinate: str | None = None,
minimum: float | None = None,
maximum: float | None = None,
drop: bool = True,
longitude: Longitude = None,
latitude: Latitude = None,
neighbor_lookup: (
MethodForInexactMatches | None
) = nearest,
tolerance: float = 0.1,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Select a location from a time series data format supported by xarray
Source code in pvgisprototype/api/series/open.py
@log_function_call
def select_location_time_series(
time_series: Path, # Is None required ?
variable: str | None = None,
coordinate: str | None = None,
minimum: float | None = None,
maximum: float | None = None,
drop: bool = True,
longitude: Longitude = None,
latitude: Latitude = None,
neighbor_lookup: MethodForInexactMatches | None = MethodForInexactMatches.nearest,
tolerance: float = 0.1,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
"""Select a location from a time series data format supported by
xarray"""
context_message = (
f"i Executing data selection function : select_location_time_series()"
)
context_message_alternative = f"[yellow]i[/yellow] Executing [underline]data selection function[/underline] : select_location_time_series()"
logger.debug(context_message, alt=context_message_alternative)
# data_array = open_data_array(
# time_series,
# mask_and_scale,
# in_memory,
# )
data = read_data_array_or_set(
input_data=time_series,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=verbose,
)
if isinstance(data, xr.Dataset):
if not variable:
raise ValueError(
"You must specify a variable when selecting from a Dataset."
)
if variable not in data:
raise ValueError(f"Variable '{variable}' not found in the Dataset.")
data_array = data[variable] # Extract the DataArray from the Dataset
logger.debug(
f" {check_mark} Successfully extracted '{variable}' from '{data_array.name}'.",
alt=f" {check_mark} [green]Successfully[/green] extracted '{variable}' from '{data_array.name}'.",
)
elif isinstance(data, xr.DataArray):
data_array = data # It's already a DataArray, use it directly
else:
raise ValueError("Unsupported data type. Must be a DataArray or Dataset.")
# Is this correctly placed here ?
if coordinate and (minimum is not None or maximum is not None):
data_array = filter_xarray(
data=data_array,
coordinate=coordinate,
minimum=minimum,
maximum=maximum,
drop=drop,
)
indexers = set_location_indexers(
data_array=data_array,
longitude=longitude,
latitude=latitude,
verbose=verbose,
)
try:
location_time_series = data_array.sel(
**indexers,
method=neighbor_lookup,
tolerance=tolerance,
)
if location_time_series.isnull().all():
logger.warning("Selection returns an empty array or all NaNs.")
location_time_series.load() # load into memory for fast processing
except Exception as exception:
# Print the error message directly to stderr to ensure it's always shown
error_message = f"Error in selecting data from {time_series} : {exception}."
error_message_alternative = (
f"Error in selecting data from [code]{time_series}[/code] : {exception}."
)
print(f"{error_message}\n")
logger.error(
error_message,
alt=error_message_alternative,
)
raise SystemExit(33)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
logger.debug(
f" < Returning selected location from time series : {location_time_series}",
alt=f" [green bold]<[/green bold] [bold]Returning[/bold] selected [brown]location[/brown] from time series : {location_time_series}",
)
return location_time_series
select_location_time_series_from_array_or_set ¶
select_location_time_series_from_array_or_set(
data: Dataset | DataArray,
variable: str | None = None,
coordinate: str | None = None,
minimum: float | None = None,
maximum: float | None = None,
drop: bool = True,
longitude: Longitude = None,
latitude: Latitude = None,
neighbor_lookup: (
MethodForInexactMatches | None
) = nearest,
tolerance: float = 0.1,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Select location-specific time series data from xarray Dataset or DataArray.
Extracts time series data for a specific geographic location from xarray data structures. Supports variable selection from Datasets, coordinate filtering, spatial interpolation using neighbor lookup methods, and loads the result into memory for efficient processing.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
data | Dataset | DataArray | Input xarray Dataset or DataArray containing time series data with spatial (longitude, latitude) and temporal dimensions. | required |
variable | str | None | Variable name to extract from Dataset. Required when data is a Dataset, ignored when data is a DataArray, by default None | None |
coordinate | str | None | Coordinate dimension name for filtering by minimum/maximum values, by default None | None |
minimum | float | None | Minimum value for coordinate filtering. Used with coordinate parameter, by default None | None |
maximum | float | None | Maximum value for coordinate filtering. Used with coordinate parameter, by default None | None |
drop | bool | Whether to drop filtered coordinates from the result, by default True | True |
longitude | Longitude | Longitude coordinate for location selection, by default None | None |
latitude | Latitude | Latitude coordinate for location selection, by default None | None |
neighbor_lookup | MethodForInexactMatches | None | Spatial interpolation method when exact coordinate matches are not found, by default MethodForInexactMatches.nearest | nearest |
tolerance | float | Maximum distance tolerance for spatial interpolation, by default 0.1 | 0.1 |
verbose | int | Verbosity level for debug output, by default VERBOSE_LEVEL_DEFAULT | VERBOSE_LEVEL_DEFAULT |
log | int | Logging level for function call logging, by default LOG_LEVEL_DEFAULT | LOG_LEVEL_DEFAULT |
Returns:
| Type | Description |
|---|---|
DataArray | Location-specific time series data as an xarray DataArray loaded into memory for efficient processing. |
Raises:
| Type | Description |
|---|---|
ValueError | If variable is not specified when data is a Dataset, or if the specified variable is not found in the Dataset, or if data type is unsupported. |
SystemExit | If data selection fails due to indexing or other processing errors. |
Warnings
Logs a warning if the selection returns an empty array or all NaN values.
Notes
The function automatically loads the selected data into memory using .load() for fast downstream processing. Coordinate filtering is applied before location selection when minimum/maximum parameters are provided.
Source code in pvgisprototype/api/series/open.py
@log_function_call
def select_location_time_series_from_array_or_set(
data: Dataset | DataArray,
variable: str | None = None,
coordinate: str | None = None,
minimum: float | None = None,
maximum: float | None = None,
drop: bool = True,
longitude: Longitude = None,
latitude: Latitude = None,
neighbor_lookup: MethodForInexactMatches | None = MethodForInexactMatches.nearest,
tolerance: float = 0.1,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
"""Select location-specific time series data from xarray Dataset or DataArray.
Extracts time series data for a specific geographic location from xarray data
structures. Supports variable selection from Datasets, coordinate filtering,
spatial interpolation using neighbor lookup methods, and loads the result
into memory for efficient processing.
Parameters
----------
data : Dataset | DataArray
Input xarray Dataset or DataArray containing time series data with
spatial (longitude, latitude) and temporal dimensions.
variable : str | None, optional
Variable name to extract from Dataset. Required when data is a Dataset,
ignored when data is a DataArray, by default None
coordinate : str | None, optional
Coordinate dimension name for filtering by minimum/maximum values,
by default None
minimum : float | None, optional
Minimum value for coordinate filtering. Used with coordinate parameter,
by default None
maximum : float | None, optional
Maximum value for coordinate filtering. Used with coordinate parameter,
by default None
drop : bool, optional
Whether to drop filtered coordinates from the result,
by default True
longitude : Longitude, optional
Longitude coordinate for location selection,
by default None
latitude : Latitude, optional
Latitude coordinate for location selection,
by default None
neighbor_lookup : MethodForInexactMatches | None, optional
Spatial interpolation method when exact coordinate matches are not found,
by default MethodForInexactMatches.nearest
tolerance : float, optional
Maximum distance tolerance for spatial interpolation,
by default 0.1
verbose : int, optional
Verbosity level for debug output,
by default VERBOSE_LEVEL_DEFAULT
log : int, optional
Logging level for function call logging,
by default LOG_LEVEL_DEFAULT
Returns
-------
DataArray
Location-specific time series data as an xarray DataArray loaded into
memory for efficient processing.
Raises
------
ValueError
If variable is not specified when data is a Dataset, or if the specified
variable is not found in the Dataset, or if data type is unsupported.
SystemExit
If data selection fails due to indexing or other processing errors.
Warnings
--------
Logs a warning if the selection returns an empty array or all NaN values.
Notes
-----
The function automatically loads the selected data into memory using .load()
for fast downstream processing. Coordinate filtering is applied before
location selection when minimum/maximum parameters are provided.
"""
context_message = (
f"i Executing data selection function : select_location_time_series()"
)
context_message_alternative = f"[yellow]i[/yellow] Executing [underline]data selection function[/underline] : select_location_time_series()"
logger.debug(context_message, alt=context_message_alternative)
# data_array = open_data_array(
# time_series,
# mask_and_scale,
# in_memory,
# )
if isinstance(data, xr.Dataset):
if not variable:
raise ValueError(
"You must specify a variable when selecting from a Dataset."
)
if variable not in data:
raise ValueError(f"Variable '{variable}' not found in the Dataset.")
data_array = data[variable] # Extract the DataArray from the Dataset
logger.debug(
f" {check_mark} Successfully extracted '{variable}' from '{data_array.name}'.",
alt=f" {check_mark} [green]Successfully[/green] extracted '{variable}' from '{data_array.name}'.",
)
elif isinstance(data, xr.DataArray):
data_array = data # It's already a DataArray, use it directly
else:
raise ValueError("Unsupported data type. Must be a DataArray or Dataset.")
# Is this correctly placed here ?
if coordinate and (minimum is not None or maximum is not None):
data_array = filter_xarray(
data=data_array,
coordinate=coordinate,
minimum=minimum,
maximum=maximum,
drop=drop,
)
indexers = set_location_indexers(
data_array=data_array,
longitude=longitude,
latitude=latitude,
verbose=verbose,
)
try:
location_time_series = data_array.sel(
**indexers,
method=neighbor_lookup,
tolerance=tolerance,
)
if location_time_series.isnull().all():
logger.warning("Selection returns an empty array or all NaNs.")
location_time_series.load() # load into memory for fast processing
except Exception as exception:
# Print the error message directly to stderr to ensure it's always shown
error_message = f"Error in selecting data for {latitude}, {longitude} from {data} : {exception}."
error_message_alternative = f"Error in selecting data for {latitude}, {longitude} from [code]{data}[/code] : {exception}."
print(f"{error_message}\n")
logger.error(
error_message,
alt=error_message_alternative,
)
raise SystemExit(33)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
logger.debug(
f" < Returning selected location from time series : {location_time_series}",
alt=f" [green bold]<[/green bold] [bold]Returning[/bold] selected [brown]location[/brown] from time series : {location_time_series}",
)
return location_time_series
set_location_indexers ¶
set_location_indexers(
data_array,
longitude: Longitude = None,
latitude: Latitude = None,
verbose: int = VERBOSE_LEVEL_DEFAULT,
)
Select single pair of coordinates from a data array
Will select center coordinates if none of (longitude, latitude) are provided.
Source code in pvgisprototype/api/series/open.py
def set_location_indexers(
data_array,
longitude: Longitude = None,
latitude: Latitude = None,
verbose: int = VERBOSE_LEVEL_DEFAULT,
):
"""Select single pair of coordinates from a data array
Will select center coordinates if none of (longitude, latitude) are
provided.
"""
# ----------------------------------------------------------- Deduplicate me
# Ugly hack for when dimensions 'longitude', 'latitude' are not spelled out!
# Use `coords` : a time series of a single pair of coordinates has only a `time` dimension!
indexers = {}
dimensions = [
dimension for dimension in data_array.coords if isinstance(dimension, str)
]
x = y = None  # guard against unbound names when no spatial coordinates match
if set(["lon", "lat"]) & set(dimensions):
x = "lon"
y = "lat"
elif set(["longitude", "latitude"]) & set(dimensions):
x = "longitude"
y = "latitude"
if x and y:
logger.debug(
f" {check_mark} Location specific dimensions detected in '{data_array.name}' : {x}, {y}"
)
if not (longitude and latitude):
warning = f" {check_mark} Coordinates (longitude, latitude) not provided. Selecting center coordinates."
logger.warning(warning)
center_longitude = float(data_array[x][len(data_array[x]) // 2])
center_latitude = float(data_array[y][len(data_array[y]) // 2])
indexers[x] = center_longitude
indexers[y] = center_latitude
text_coordinates = f"{check_mark} Center coordinates (longitude, latitude) : {center_longitude}, {center_latitude}."
else:
indexers[x] = longitude
indexers[y] = latitude
text_coordinates = f" {check_mark} Coordinates : {longitude}, {latitude}."
logger.debug(text_coordinates)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return indexers
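The coordinate-name detection above accepts both abbreviated and spelled-out spatial names. A defensive stand-alone variant that returns `(None, None)` when neither naming scheme matches (illustrative helper, not part of the API):

```python
def detect_spatial_names(coords):
    """Map abbreviated or spelled-out coordinate names to (x, y), if present."""
    names = set(coords)
    if {"lon", "lat"} & names:
        return "lon", "lat"
    if {"longitude", "latitude"} & names:
        return "longitude", "latitude"
    return None, None

detect_spatial_names(["time", "lat", "lon"])  # -> ('lon', 'lat')
detect_spatial_names(["time"])                # -> (None, None)
```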
plot ¶
Functions:
| Name | Description |
|---|---|
get_coordinates | Ugly hack for when dimensions 'longitude', 'latitude' are not spelled out! |
plot_outliers | Plot outliers in location series |
plot_series | Plot series over a location |
get_coordinates ¶
Ugly hack for when dimensions 'longitude', 'latitude' are not spelled out! Use coords : a time series of a single pair of coordinates has only a time dimension!
Source code in pvgisprototype/api/series/plot.py
def get_coordinates(data_array: xr.DataArray) -> tuple:
"""
Ugly hack for when dimensions 'longitude', 'latitude' are not spelled out!
Use `coords` : a time series of a single pair of coordinates has only a `time` dimension!
"""
dimensions = [
dimension for dimension in data_array.coords if isinstance(dimension, str)
]
x = y = None  # guard against unbound names when no spatial coordinates match
if set(["lon", "lat"]) & set(dimensions):
x = "lon"
y = "lat"
elif set(["longitude", "latitude"]) & set(dimensions):
x = "longitude"
y = "latitude"
if x and y:
logger.debug(f"Dimensions : {x}, {y}")
return x, y
plot_outliers ¶
plot_outliers(
data_array,
time,
outliers,
sensitivity_factor,
figure_name,
add_offset=None,
variable_name_as_suffix=None,
)
Plot outliers in location series
Source code in pvgisprototype/api/series/plot.py
def plot_outliers(
data_array,
time,
outliers,
sensitivity_factor,
figure_name,
add_offset=None,
variable_name_as_suffix=None,
):
"""
Plot outliers in location series
"""
_, ax = plt.subplots(figsize=(16, 9))
data_array.plot(alpha=0.7)
outliers.plot.line(
"rd", ms=7, label=f"Outliers (sensitivity : {sensitivity_factor})"
)
with_sensitivity_factor = "_iqr_with_sensitivity_" + str(
sensitivity_factor
).replace(".", "")
figure_name = Path(str(figure_name) + with_sensitivity_factor + "_on")
if variable_name_as_suffix:
if getattr(data_array, "long_name", None):
long_name = data_array.long_name.replace(" ", "_").lower()
figure_name = Path(str(figure_name) + "_" + long_name)
else:
name = data_array.name.replace(" ", "_")
figure_name = Path(str(figure_name) + "_" + name)
if time:
time = str(time).replace("-", "")
figure_name = Path(str(figure_name) + "_" + str(time))
if getattr(data_array, "long_name", None):
plt.suptitle(f"{data_array.long_name}")
else:
plt.suptitle(f"{data_array.name}")
# ----------------------------------------------------------- Deduplicate me
# Ugly hack for when dimensions 'longitude', 'latitude' are not spelled out!
# Use `coords` : a time series of a single pair of coordinates has only a `time` dimension!
dimensions = [
dimension for dimension in data_array.coords if isinstance(dimension, str)
]
x = y = None  # guard against unbound names when no spatial coordinates match
if set(["lon", "lat"]) & set(dimensions):
x = "lon"
y = "lat"
elif set(["longitude", "latitude"]) & set(dimensions):
x = "longitude"
y = "latitude"
# Deduplicate me -----------------------------------------------------------
title = f"({data_array[x].name}, {data_array[y].name}) {data_array[x].values}, {data_array[y].values}"
plt.title(title)
# Special case : ----------------------------------------------------------
# Scaling a map is done via :
# `output = input * scale + offset`.
# This operation might be technically successful.
# However, if the input map is empty (say something went wrong in reading it),
# the output map will present pixel values equal to the offset!
# In pseudo-code:
# `if input == 0 : output = offset`
outliers_values = np.unique(outliers.values[~np.isnan(outliers.values)])
if add_offset in outliers_values:
plt.text(
0.5,
0.1,
# f'Outlier equals to offset!\nScale : {scale_factor}, Offset : {add_offset}',
f"Outlier equals to Offset : {add_offset}",
color="indigo",
ha="center",
va="center",
transform=ax.transAxes,
)
plt.xlabel("")
# Look out for this! ------------------------------------------------------
plt.legend(loc="upper right")
file_extension = "png"
output_filename = f"{figure_name}.{file_extension}"
plt.savefig(f"{output_filename}")
number_of_outliers = len(outliers_values)
logger.debug(
f"{check_mark} Time series plot of {number_of_outliers} values over ({float(data_array[x])}, {float(data_array[y])}) exported in {output_filename}!"
)
print(
f"{check_mark} Time series plot of {number_of_outliers} values over ({float(data_array[x])}, {float(data_array[y])}) exported in {output_filename}!"
)
return output_filename
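The figure-name suffixing above (sensitivity factor with the dot stripped, optional variable name, optional compacted date) can be reproduced in isolation. A hedged sketch with an illustrative helper name:

```python
from pathlib import Path

def outlier_figure_name(base, sensitivity_factor, long_name=None, time=None):
    """Rebuild the suffixed figure name used for outlier plots."""
    name = str(base) + "_iqr_with_sensitivity_" + str(sensitivity_factor).replace(".", "")
    name += "_on"
    if long_name:
        name += "_" + long_name.replace(" ", "_").lower()
    if time:
        name += "_" + str(time).replace("-", "")
    return Path(name + ".png")

outlier_figure_name("outliers", 1.5, long_name="2 m temperature", time="2020-06-01")
```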
plot_series ¶
plot_series(
data_array,
time: DatetimeIndex,
default_dimension="time",
ask_for_dimension=True,
figure_name: str = "series_plot",
save_path: Path = cwd(),
file_extension: str = "png",
add_offset: bool = False,
variable_name_as_suffix: bool = None,
tufte_style: bool = None,
width: int = 16,
height: int = 9,
resample_large_series: bool = False,
data_source: str = "",
fingerprint: bool = False,
)
Plot series over a location
Source code in pvgisprototype/api/series/plot.py
def plot_series(
data_array,
time: DatetimeIndex,
default_dimension="time",
ask_for_dimension=True,
# slice_options=None,
figure_name: str = "series_plot",
save_path: Path = Path.cwd(),
file_extension: str = "png",
add_offset: bool = False,
variable_name_as_suffix: bool = None,
tufte_style: bool = None,
width: int = 16,
height: int = 9,
resample_large_series: bool = False,
data_source: str = "",
fingerprint: bool = False,
):
"""
Plot series over a location
"""
if not isinstance(data_array, xr.DataArray):
raise ValueError(
f"The input array {data_array} is not an xarray DataArray and cannot be plotted!"
)
x, y = get_coordinates(data_array)
# Prepare plot
fig, ax = plt.subplots(figsize=(width, height))
# Set grid properties
# ax.grid(color='grey', linestyle='-', linewidth=0.5, alpha=0.5, zorder=0)
# Plot data
if resample_large_series:
logger.debug(
f"Request for `--resample-large-series`",
alt=f"Request for `--resample-large-series`",
)
data_array = data_array.resample(time="1D").mean()
logger.debug(
f"Resampled data array : {data_array}",
alt=f"Resampled data array : {data_array}",
)
dimensions = list(data_array.dims)
num_dimensions = len(dimensions)
if num_dimensions == 1:
data_array.plot(
ax=ax,
alpha=0.5,
color="black",
linewidth=1,
marker="o",
markersize=3,
zorder=1,
)
# Remove unwanted spines
ax.spines["top"].set_visible(False)
ax.spines["right"].set_visible(False)
# Remove x-axis label
plt.xlabel("")
# Set title with fallback for missing 'long_name'
supertitle = getattr(data_array, "long_name", None)
fig.suptitle(
supertitle,
fontsize="xx-large",
ha="right",
va="top",
x=0.9,
y=0.95,
# rotation=270,
)
coordinate_x = round(float(data_array[x]), 3)
coordinate_y = round(float(data_array[y]), 3)
title = f"({data_array[x].name}, {data_array[y].name}) "
title += f"{coordinate_x}, {coordinate_y}"
ax.set_title(
title,
fontsize="xx-large",
ha="left",
va="top",
x=0.7,
y=0.95,
# rotation=270,
)
# supertitle += f'\n{title}'
# Format tick labels
ax.tick_params(axis="both", which="major", labelsize=14, direction="in")
ax.ticklabel_format(
axis="y",
style="scientific",
# scilimits=(5, 5),
useOffset=None,
useLocale=None,
useMathText=None,
)
if tufte_style:
# First, get minimum and maximum values
minimum_value = float(data_array.min())
minimum_value = np.fix(minimum_value) # if close to 0
maximum_value = float(data_array.max())
# X limits
x_limits = ax.get_xlim()
ax.set_xlim(x_limits[0], x_limits[1])
# X spine
ax.spines["bottom"].set_linewidth(0.5)
# Convert datetime to numerical representation
minimum_timestamp = mdates.date2num(data_array.time.values[0])
maximum_timestamp = mdates.date2num(data_array.time.values[-1])
ax.spines["bottom"].set_bounds(minimum_timestamp, maximum_timestamp)
# Y spine
ax.spines["left"].set_linewidth(0.5)
ax.spines["left"].set_bounds(minimum_value, maximum_value)
# Only show ticks on bottom and left frame
ax.get_xaxis().tick_bottom()
ax.get_yaxis().tick_left()
# Calculate tick positions
# -----------------------------------------------------------
# # Set the y-ticks to align with the minimum and maximum values
# ax.set_yticks([minimum_value, maximum_value])
# # Add an extra tick for the maximum value
# ax.set_yticks(ax.get_yticks().tolist() + [maximum_value])
# -----------------------------------------------------------
num_ticks = 5 # Adjust the number of ticks as desired
tick_locations = np.linspace(minimum_value, maximum_value, num_ticks)
# Align y-ticks with tick positions
ax.set_yticks(tick_locations)
# Set axis labels
# ax.set_xlabel(data_array['time'].name, fontsize=16)
# ax.set_ylabel(data_array[y].units, fontsize=18)
# Do not plot the 'normal' title
fig.suptitle("")
plt.title("")
# Plot title on the side
if getattr(data_array, "long_name", None):
# supertitle = f'{data_array.long_name}'
# supertitle += f'\n{title}'
# Adjust the positioning slightly to the right of the plot
right_margin_offset = 0.02 # Adjust as needed based on figure size
text_x_position = (
1 + right_margin_offset
) # 1 corresponds to the far right of the plot
text_background_box = dict(
facecolor="white",
alpha=0.5,
edgecolor="none",
boxstyle="round,pad=0.5",
)
# supertitle_right = ax.text(
# text_x_position, # maximum_timestamp,
# 1, # maximum_value,
# f"{data_array.long_name}",
# fontsize="x-large",
# bbox=text_background_box,
# va="top",
# ha="right",
# transform=ax.transAxes, # ensure positioning is relative to axes size
# )
# supertitle_right_bbox = supertitle_right.get_window_extent()
# supertitle_right_height = supertitle_right_bbox.height
# semi-transparent background box for legibility ?
ax.text(
text_x_position, # maximum_timestamp,
1, # supertitle_right_bbox.y0 - supertitle_right_height,
f"{title}",
fontsize="large",
bbox=text_background_box,
va="top",
ha="right",
transform=ax.transAxes, # ensure positioning is relative to axes size
)
else:
# plt.suptitle(f'{data_array.name}')
# Axis labels as a title annotation.
ax.text(
data_array.time[-1],
maximum_value,
f"{data_array.name}",
fontsize="x-large",
)
elif num_dimensions > 1:
# Set title with fallback for missing 'long_name'
supertitle = getattr(data_array, "long_name", None)
fig.suptitle(
supertitle,
fontsize="xx-large",
ha="right",
va="top",
x=0.9,
y=0.95,
# rotation=270,
)
default_dimension = "time"
print(f"Detected complex structure with dimensions: {dimensions}.")
# if ask_for_dimension:
# print(f"Please specify a dimension to plot over (choose from: {dimensions}):")
# plot_dimension = input("Dimension: ")
# else:
# # Use default dimension if available
# plot_dimension = default_dimension if default_dimension in dimensions else dimensions[0]
# ---
print(
f"Do you want to specify a dimension other than '{default_dimension}' to plot over (choose from: {dimensions}):"
)
plot_dimension = input("Dimension: ")
# plot_dimension = default_dimension if default_dimension in dimensions else dimensions[0]
if plot_dimension not in dimensions:
raise ValueError(
f"Invalid dimension: {plot_dimension}. Available dimensions: {dimensions}"
)
if plot_dimension == default_dimension:
data_to_plot = data_array.mean(
dim=[dim for dim in dimensions if dim != plot_dimension]
)
print(
f"Aggregating over other dimensions. From {data_to_plot} plotting {plot_dimension} vs data."
)
data_to_plot.plot()
# elif slice_options and plot_dimension in slice_options:
# slice_values = slice_options[plot_dimension]
# fig, axes = plt.subplots(len(slice_values), 1, figsize=(10, 5 * len(slice_values)))
# for i, slice_value in enumerate(slice_values):
# data_slice = data_array.sel({plot_dimension: slice_value})
# data_slice.plot(ax=axes[i], alpha=0.5, color="black", linewidth=1)
# axes[i].set_title(f'{plot_dimension.capitalize()}: {slice_value}')
else:
print(f"Aggregating over other dimensions for {plot_dimension}.")
data_to_plot = data_array.mean(
dim=[dim for dim in dimensions if dim != plot_dimension]
)
data_to_plot.plot(alpha=0.5, color="black", linewidth=1)
# Identity
plt.subplots_adjust(bottom=0.18)
identity_text = "© PVGIS · Joint Research Centre, European Commission"
if data_source:
identity_text += f" · Data source : {data_source}"
if fingerprint:
from pvgisprototype.core.hashing import generate_hash
data_array_hash = generate_hash(data_array)
identity_text += f" · Fingerprint : {data_array_hash}"
fig.text(
0.5,
0.02,
identity_text,
fontsize=12,
color="gray",
ha="center",
alpha=0.5,
)
if figure_name:
# Handle variable name as suffix
if variable_name_as_suffix:
name_suffix = (
getattr(data_array, "long_name", data_array.name)
.replace(" ", "_")
.lower()
)
figure_name = f"{figure_name}_{name_suffix}"
# Handle time-based naming
if isinstance(time, (list, tuple)) and len(time) == 1:
time = time[0]
time_string = str(time).replace("-", "")
else:
start_time = data_array.time.to_series().iloc[0].strftime("%Y%m%d%H%M%S")
end_time = data_array.time.to_series().iloc[-1].strftime("%Y%m%d%H%M%S")
time_string = f"{start_time}_{end_time}"
figure_name = f"{figure_name}_{time_string}"
else:
figure_name = "series_plot"  # the long name of the input data array ?
# 'supertitle' is only bound in the multi-dimensional branch above
if num_dimensions > 1 and supertitle:
figure_name += f"_{supertitle}"
from datetime import datetime
figure_name += f"_{datetime.now().strftime('%Y%m%d%H%M%S')}"
if fingerprint:
from pvgisprototype.core.hashing import generate_hash
figure_name += f"_{data_array_hash}"
# plt.legend(loc='upper right')
# Save figure
if not tufte_style:
plt.tight_layout()
output_filename = f"{figure_name}.{file_extension}"
plt.savefig(save_path / output_filename)
# dpi=300,
# bbox_inches='tight'
# )
# Report
number_of_values = int(data_array.count())
logger.debug(
f"{check_mark} Time series plot of {number_of_values} values over ({float(data_array[x])}, {float(data_array[y])}) exported in {output_filename}!"
)
print(
f"[green]{check_mark}[/green] Time series plot of {number_of_values} values over ({float(data_array[x])}, {float(data_array[y])}) exported in '{output_filename}'"
)
return output_filename
relative_humidity ¶
Functions:
| Name | Description |
|---|---|
get_relative_humidity_series | |
get_relative_humidity_series_from_array_or_set | Extract relative humidity time series from xarray DataArray or Dataset. |
get_relative_humidity_series ¶
get_relative_humidity_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex = str(now()),
relative_humidity_series: (
RelativeHumiditySeries | Path
) = array(TEMPERATURE_DEFAULT),
neighbor_lookup: MethodForInexactMatches = nearest,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Source code in pvgisprototype/api/series/relative_humidity.py
def get_relative_humidity_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex = str(Timestamp.now()),
relative_humidity_series: RelativeHumiditySeries | Path = array(TEMPERATURE_DEFAULT),
neighbor_lookup: MethodForInexactMatches = MethodForInexactMatches.nearest,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
""" """
if isinstance(relative_humidity_series, Path):
from pvgisprototype.api.series.select import select_time_series
# from pvgisprototype.api.utilities.conversions import (
# convert_float_to_degrees_if_requested,
# )
# from pvgisprototype.constants import DEGREES
relative_humidity_times_series = (
select_time_series(
time_series=relative_humidity_series,
# longitude=convert_float_to_degrees_if_requested(longitude, DEGREES),
longitude=longitude,
# latitude=convert_float_to_degrees_if_requested(latitude, DEGREES),
latitude=latitude,
timestamps=timestamps,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=0, # no verbosity here by choice!
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
if relative_humidity_times_series.size == 1 and relative_humidity_times_series.shape == ():
relative_humidity_times_series = array([relative_humidity_times_series], dtype=dtype)
return RelativeHumiditySeries(
value=relative_humidity_times_series,
# unit=SYMBOL_UNIT_TEMPERATURE,
data_source=relative_humidity_series.name,
)
else:
return relative_humidity_series
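The scalar guard near the end of the function (a 0-d result from a single-timestamp selection is wrapped into a 1-element array) can be illustrated in isolation. The sketch below is independent of pvgisprototype; `normalize_to_1d` is a hypothetical helper name, not part of the API:

```python
import numpy as np

def normalize_to_1d(values, dtype="float64"):
    """Wrap a 0-d (scalar) array in a 1-element 1-D array, as done
    after a single-timestamp selection; pass 1-D arrays through."""
    values = np.asarray(values, dtype=dtype)
    if values.size == 1 and values.shape == ():
        values = np.array([values], dtype=dtype)
    return values

scalar = np.float64(42.0)          # what a single-timestamp .sel() yields
series = np.array([1.0, 2.0])      # a regular time series

print(normalize_to_1d(scalar).shape)  # (1,)
print(normalize_to_1d(series).shape)  # (2,)
```

This keeps downstream consumers of `RelativeHumiditySeries.value` free to assume a 1-D array regardless of how many timestamps were requested.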
get_relative_humidity_series_from_array_or_set ¶
get_relative_humidity_series_from_array_or_set(
longitude: float,
latitude: float,
relative_humidity_series: DataArray | Dataset,
timestamps: DatetimeIndex = str(now()),
neighbor_lookup: (
MethodForInexactMatches | None
) = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Extract relative humidity time series from xarray DataArray or Dataset.
Selects and extracts relative humidity data for a specific geographic location and time period from an xarray DataArray or Dataset. Performs spatial interpolation using the specified neighbor lookup method and temporal selection based on the provided timestamps. Returns a structured RelativeHumiditySeries object with proper units and metadata.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
longitude | float | Longitude coordinate for data extraction (in degrees or radians). Will be converted to degrees internally if needed. | required |
latitude | float | Latitude coordinate for data extraction (in degrees or radians). Will be converted to degrees internally if needed. | required |
relative_humidity_series | DataArray | Dataset | Input xarray DataArray or Dataset containing relative humidity data with spatial (longitude, latitude) and temporal dimensions. | required |
timestamps | DatetimeIndex | Time index for temporal selection of the data, by default str(Timestamp.now()) | str(now()) |
neighbor_lookup | MethodForInexactMatches | None | Method for spatial interpolation when exact coordinate matches are not found, by default NEIGHBOR_LOOKUP_DEFAULT | NEIGHBOR_LOOKUP_DEFAULT |
tolerance | float | None | Maximum distance tolerance for spatial interpolation, by default TOLERANCE_DEFAULT | TOLERANCE_DEFAULT |
dtype | str | Data type for the output numpy array values, by default DATA_TYPE_DEFAULT | DATA_TYPE_DEFAULT |
log | int | Logging level for debug output, by default LOG_LEVEL_DEFAULT | LOG_LEVEL_DEFAULT |
Returns:
| Type | Description |
|---|---|
RelativeHumiditySeries | Structured relative humidity time series object containing: - value: 1D numpy array with relative humidity values - unit: relative humidity unit designation (typically percent or a dimensionless fraction) - data_source: Original data source name from input |
Raises:
| Type | Description |
|---|---|
TypeError | If relative_humidity_series is not a DataArray or Dataset. |
Notes
The function automatically handles coordinate conversion to ensure compatibility with the underlying data. Scalar results are converted to 1D arrays for consistency in downstream processing. Relative humidity units are preserved from the original dataset metadata.
Source code in pvgisprototype/api/series/relative_humidity.py
def get_relative_humidity_series_from_array_or_set(
longitude: float,
latitude: float,
relative_humidity_series: DataArray | Dataset,
timestamps: DatetimeIndex = str(Timestamp.now()),
neighbor_lookup: MethodForInexactMatches | None = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
"""Extract relative humidity time series from xarray DataArray or Dataset.
Selects and extracts relative humidity data for a specific geographic location
and time period from an xarray DataArray or Dataset. Performs spatial
interpolation using the specified neighbor lookup method and temporal selection
based on the provided timestamps. Returns a structured RelativeHumiditySeries
object with proper units and metadata.
Parameters
----------
longitude : float
Longitude coordinate for data extraction (in degrees or radians).
Will be converted to degrees internally if needed.
latitude : float
Latitude coordinate for data extraction (in degrees or radians).
Will be converted to degrees internally if needed.
relative_humidity_series : DataArray | Dataset
Input xarray DataArray or Dataset containing relative humidity data
with spatial (longitude, latitude) and temporal dimensions.
timestamps : DatetimeIndex, optional
Time index for temporal selection of the data,
by default str(Timestamp.now())
neighbor_lookup : MethodForInexactMatches | None, optional
Method for spatial interpolation when exact coordinate matches are not found,
by default NEIGHBOR_LOOKUP_DEFAULT
tolerance : float | None, optional
Maximum distance tolerance for spatial interpolation,
by default TOLERANCE_DEFAULT
dtype : str, optional
Data type for the output numpy array values,
by default DATA_TYPE_DEFAULT
log : int, optional
Logging level for debug output,
by default LOG_LEVEL_DEFAULT
Returns
-------
RelativeHumiditySeries
Structured relative humidity time series object containing:
- value: 1D numpy array with relative humidity values
    - unit: relative humidity unit designation (typically percent or a dimensionless fraction)
- data_source: Original data source name from input
Raises
------
TypeError
If relative_humidity_series is not a DataArray or Dataset.
Notes
-----
The function automatically handles coordinate conversion to ensure compatibility
with the underlying data. Scalar results are converted to 1D arrays for
consistency in downstream processing. Relative humidity units are preserved from
the original dataset metadata.
"""
from pvgisprototype.api.series.select import select_time_series_from_array_or_set
if isinstance(relative_humidity_series, DataArray | Dataset):
# from pvgisprototype.api.utilities.conversions import (
# convert_float_to_degrees_if_requested,
# )
# from pvgisprototype.constants import DEGREES
relative_humidity_times_series = (
select_time_series_from_array_or_set(
data=relative_humidity_series,
# longitude=convert_float_to_degrees_if_requested(longitude, DEGREES),
longitude=longitude,
# latitude=convert_float_to_degrees_if_requested(latitude, DEGREES),
latitude=latitude,
timestamps=timestamps,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
verbose=0, # no verbosity here by choice!
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
if relative_humidity_times_series.size == 1 and relative_humidity_times_series.shape == ():
relative_humidity_times_series = array([relative_humidity_times_series], dtype=dtype)
else:
raise TypeError("Relative humidity series must be a DataArray or Dataset.")
return RelativeHumiditySeries(
value=relative_humidity_times_series,
data_source=relative_humidity_series.name,
)
select ¶
Functions:
| Name | Description |
|---|---|
remap_to_2013 | Point to 2013-01-01 for spectral effect factor(s). This is a HACK. |
select_time_series | Select location series |
select_time_series_from_array_or_set | Select location-specific time series with temporal filtering and validation. |
remap_to_2013 ¶
Point to 2013-01-01 for spectral effect factor(s). This is a HACK.
The spectral effect factor maps generated by Thomas Huld comprise a set of 12 monthly global maps for the reference year 2013.
Since we currently have no data other than the monthly 2013 values, this function remaps timestamps to the beginning of their month in the year 2013.
Adjusts the year of the given timestamp to 2013, handling specific edge cases:
- Moves December timestamps to January 1, 2013.
- Adjusts February 29 to February 28, 2013, since 2013 is not a leap year.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
ts | datetime | The timestamp to adjust. | required |
Returns:
| Type | Description |
|---|---|
datetime | The adjusted timestamp. |
Raises:
| Type | Description |
|---|---|
ValueError | If the date adjustment fails due to an invalid date that is not February 29. |
Source code in pvgisprototype/api/series/select.py
def remap_to_2013(ts):
"""Point to 2013-01-01 for spectral effect factor(s). This is a HACK.
The spectral effect factor maps generated by Thomas Huld comprise a set of
12 monthly global maps for the reference year 2013.
Since we currently have no data other than the monthly 2013 values,
this function remaps timestamps to the beginning of their month
in the year 2013.
Adjusts the year of the given timestamp to 2013, handling specific edge cases:
- Moves December timestamps to January 1, 2013.
- Adjusts February 29 to February 28, 2013, for non-leap years.
Parameters
----------
ts: datetime
The timestamp to adjust.
Returns
-------
datetime:
The adjusted timestamp.
Raises
------
ValueError:
If the date adjustment fails due to an invalid date that is not
February 29.
"""
try:
if ts.month == 12:
return ts.replace(year=2013, month=1, day=1)
return ts.replace(year=2013)
except ValueError:
if ts.month == 2 and ts.day == 29:
return ts.replace(year=2013, day=28)
raise
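Because `remap_to_2013` is pure date arithmetic, its edge cases are easy to verify standalone. The sketch below reproduces the function with only the standard library `datetime`:

```python
from datetime import datetime

def remap_to_2013(ts):
    """Map a timestamp onto reference year 2013: December rolls over
    to 2013-01-01, and Feb 29 falls back to Feb 28 (2013 is not a
    leap year)."""
    try:
        if ts.month == 12:
            return ts.replace(year=2013, month=1, day=1)
        return ts.replace(year=2013)
    except ValueError:
        if ts.month == 2 and ts.day == 29:
            return ts.replace(year=2013, day=28)
        raise

print(remap_to_2013(datetime(2020, 12, 31)))  # 2013-01-01 00:00:00
print(remap_to_2013(datetime(2020, 2, 29)))   # 2013-02-28 00:00:00
print(remap_to_2013(datetime(2021, 3, 5)))    # 2013-03-05 00:00:00
```

Note that `ts.replace(year=2013)` raises `ValueError` for Feb 29 precisely because 2013 has no such date, which is what routes that case into the fallback branch.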
select_time_series ¶
select_time_series(
time_series: Path | None,
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex | None,
start_time: datetime | None = None,
end_time: datetime | None = None,
remap_to_month_start: bool = False,
variable: str | None = None,
coordinate: str | None = None,
minimum: float | None = None,
maximum: float | None = None,
drop: bool = True,
neighbor_lookup: MethodForInexactMatches | None = None,
tolerance: float | None = 0.1,
time_tolerance: str = "15m",
mask_and_scale: bool = False,
in_memory: bool = False,
variable_name_as_suffix: bool = True,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Select location series
Source code in pvgisprototype/api/series/select.py
@log_function_call
@custom_cached
def select_time_series(
time_series: Path | None,
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex | None,
start_time: datetime | None = None,
end_time: datetime | None = None,
remap_to_month_start: bool = False,
# convert_longitude_360: bool = False,
variable: str | None = None,
coordinate: str | None = None,
minimum: float | None = None,
maximum: float | None = None,
drop: bool = True,
neighbor_lookup: MethodForInexactMatches | None = None,
tolerance: float | None = 0.1, # Customize default if needed
time_tolerance: str = "15m", # Important for merged Datasets
mask_and_scale: bool = False,
in_memory: bool = False,
variable_name_as_suffix: bool = True,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
"""Select location series"""
if time_series is None:
return None
# if convert_longitude_360:
# longitude = longitude % 360
# warn_for_negative_longitude(longitude)
path_to = f"{time_series.parent.absolute()}"
path_to_alternative = f"[code]{path_to}[/code]"
data_description = f"Data file in {path_to} : {time_series.name}"
data_description_alternative = (
f"Data file in {path_to_alternative} : [code]{time_series.name}[/code]"
)
logger.debug(data_description, alt=data_description_alternative)
scale_factor, add_offset = get_scale_and_offset(time_series)
logger.debug(
f"Scale factor : {scale_factor}, Offset : {add_offset}",
alt=f"Scale factor : {scale_factor}, Offset : {add_offset}",
)
if longitude and latitude:
coordinates = f"Requested location coordinates : {longitude}, {latitude}"
coordinates_alternative = (
f"[bold]Requested[/bold] location coordinates : {longitude}, {latitude}"
)
logger.debug(coordinates, alt=coordinates_alternative)
location_time_series = select_location_time_series(
time_series=time_series,
coordinate=coordinate,
minimum=minimum,
maximum=maximum,
drop=drop,
longitude=longitude,
latitude=latitude,
variable=variable,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=verbose,
# log=log,
)
logger.debug(
f"Selected location from time series : {location_time_series}",
alt=f"Selected [brown]location[/brown] from time series : {location_time_series}",
)
# ------------------------------------------------------------------------
if (start_time or end_time) and not remap_to_month_start:
timestamps = None # we don't need a timestamp anymore!
if start_time and not end_time: # set `end_time` to end of series
end_time = location_time_series.time.values[-1]
elif end_time and not start_time: # set `start_time` to beginning of series
start_time = location_time_series.time.values[0]
else: # Convert `start_time` & `end_time` to the correct string format
start_time = start_time.strftime("%Y-%m-%d %H:%M:%S")
end_time = end_time.strftime("%Y-%m-%d %H:%M:%S")
try:
location_time_series = location_time_series.sel(
time=slice(start_time, end_time)
)
except Exception:
logger.exception(
f"No data found for the given period {start_time} and {end_time}."
)
if remap_to_month_start:
logger.debug(
f"Remapping all timestamps for {time_series.name} to the reference year 2013",
alt=f"[bold]Remapping[/bold] all timestamps for {time_series.name} to the reference year 2013",
)
remapped_timestamps = timestamps.map(lambda ts: remap_to_2013(ts))
if not remapped_timestamps.empty:
from pandas import date_range
month_start_timestamps = date_range(
start=remapped_timestamps.min().normalize(),
end=remapped_timestamps.max(),
freq="MS",
)
try:
location_time_series = location_time_series.sel(
time=month_start_timestamps, method=neighbor_lookup
)
except Exception:
logger.exception(
f"No data found for the given 'month start' timestamps {month_start_timestamps}.",
alt=f"[red]No data found for the given 'month start' timestamps {month_start_timestamps}[/red].",
)
else:
error_message = "Remapped timestamps are empty, cannot proceed with date range creation."
logger.error(error_message)
raise ValueError(error_message)
if timestamps is not None and not start_time and not end_time:
data_time_min = location_time_series.time.min().values
data_time_max = location_time_series.time.max().values
# Check if all timestamps fall outside the temporal range of the dataset
if not remap_to_month_start and (
timestamps.min() < data_time_min or timestamps.max() > data_time_max
):
raise ValueError(
f"All requested timestamps fall outside the data's time range "
f"({data_time_min} to {data_time_max})."
)
if len(timestamps) == 1:
logger.warning(
"Single timestamp selected!",
alt="[bold yellow]Single timestamp selected![/bold yellow]",
)
timestamps = start_time = end_time = timestamps[0]
try:
location_time_series = location_time_series.sel(
time=timestamps,
method=neighbor_lookup,
# tolerance=time_tolerance,
)
if (
"time" in location_time_series.coords
and location_time_series.time.size > 1
):
if location_time_series.indexes["time"].duplicated().any():
logger.error(
"Duplicate timestamps detected in location_time_series.",
alt="[red]Duplicate timestamps detected in location_time_series![/red]",
)
if (
not remap_to_month_start
and location_time_series.indexes["time"].duplicated().any()
):
raise ValueError("Duplicate timestamps detected!")
logger.debug(
f"Selected timestamps from location time series : {location_time_series}",
alt=f"[bold]Selected[/bold] [blue]timestamps[/blue] from [brown]location[/brown] time series : {location_time_series}",
)
else:
logger.debug(
f"Single timestamp selected: {location_time_series.time.values}"
)
except KeyError:
error_message = f"No data found for one or more of the requested timestamps : {timestamps}."
logger.exception(error_message)
raise ValueError(error_message)
if location_time_series.size == 1:
single_value = float(location_time_series.values)
warning = (
f"{exclamation_mark} The selected timestamp "
+ f"{location_time_series.time.values}"
+ " matches the single value "
+ f"{single_value}"
)
logger.warning(warning)
if verbose > 0:
logger.warning(warning)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=location_time_series.values,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return location_time_series
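The range-completion step above (filling in a missing `start_time` or `end_time` from the ends of the series) can be sketched independently. `complete_time_range`, `first`, and `last` below are hypothetical names standing in for the series' first and last timestamps:

```python
from datetime import datetime

def complete_time_range(start_time, end_time, first, last):
    """Complete a half-open request: a missing bound falls back to the
    corresponding end of the series; a fully given range is formatted
    as strings for slicing."""
    if start_time and not end_time:
        end_time = last
    elif end_time and not start_time:
        start_time = first
    else:
        start_time = start_time.strftime("%Y-%m-%d %H:%M:%S")
        end_time = end_time.strftime("%Y-%m-%d %H:%M:%S")
    return start_time, end_time

first = datetime(2020, 1, 1)
last = datetime(2020, 12, 31)
print(complete_time_range(datetime(2020, 6, 1), None, first, last))
# (datetime.datetime(2020, 6, 1, 0, 0), datetime.datetime(2020, 12, 31, 0, 0))
```

The completed bounds then feed `location_time_series.sel(time=slice(start_time, end_time))`, which accepts either datetimes or such formatted strings.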
select_time_series_from_array_or_set ¶
select_time_series_from_array_or_set(
data: Dataset | DataArray,
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex | None,
start_time: datetime | None = None,
end_time: datetime | None = None,
remap_to_month_start: bool = False,
variable: str | None = None,
coordinate: str | None = None,
minimum: float | None = None,
maximum: float | None = None,
drop: bool = True,
neighbor_lookup: MethodForInexactMatches = none,
tolerance: float | None = 0.1,
time_tolerance: str = "15m",
variable_name_as_suffix: bool = True,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Select location-specific time series with temporal filtering and validation.
Extracts time series data for a specific geographic location from xarray data structures with comprehensive temporal selection capabilities. Supports multiple temporal selection modes including timestamp-based selection, time range slicing, and month start remapping. Includes extensive validation and error handling for temporal bounds and duplicate detection.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
data | Dataset | DataArray | Input xarray Dataset or DataArray containing time series data with spatial (longitude, latitude) and temporal dimensions. | required |
longitude | Longitude | Longitude coordinate for location selection. | required |
latitude | Latitude | Latitude coordinate for location selection. | required |
timestamps | DatetimeIndex | None | Specific timestamps for temporal selection. If None, uses start_time/end_time for range selection. | required |
start_time | datetime | None | Start time for temporal range selection. Mutually exclusive with timestamps, by default None | None |
end_time | datetime | None | End time for temporal range selection. Mutually exclusive with timestamps, by default None | None |
remap_to_month_start | bool | Whether to remap all timestamps to month start dates in reference year 2013, by default False | False |
variable | str | None | Variable name to extract from Dataset. Required when data is a Dataset, by default None | None |
coordinate | str | None | Coordinate dimension name for filtering by minimum/maximum values, by default None | None |
minimum | float | None | Minimum value for coordinate filtering. Used with coordinate parameter, by default None | None |
maximum | float | None | Maximum value for coordinate filtering. Used with coordinate parameter, by default None | None |
drop | bool | Whether to drop filtered coordinates from the result, by default True | True |
neighbor_lookup | MethodForInexactMatches | None | Spatial interpolation method when exact coordinate matches are not found, by default None | none |
tolerance | float | None | Maximum distance tolerance for spatial interpolation, by default 0.1 | 0.1 |
time_tolerance | str | Temporal tolerance for timestamp matching in merged datasets, by default "15m" | '15m' |
variable_name_as_suffix | bool | Whether to use variable name as suffix in output naming, by default True | True |
verbose | int | Verbosity level for debug output and warnings, by default VERBOSE_LEVEL_DEFAULT | VERBOSE_LEVEL_DEFAULT |
log | int | Logging level for function call logging and data fingerprinting, by default LOG_LEVEL_DEFAULT | LOG_LEVEL_DEFAULT |
Returns:
| Type | Description |
|---|---|
DataArray | Location and temporally filtered time series data as an xarray DataArray. Single timestamp selections return scalar-like DataArrays. |
Raises:
| Type | Description |
|---|---|
ValueError | If requested timestamps fall outside the data's temporal range, if duplicate timestamps are detected when not using month start remapping, or if remapped timestamps are empty. |
KeyError | If no data is found for one or more requested timestamps. |
Warnings
Logs warnings for single timestamp selections and single value results.
Notes
The function supports three temporal selection modes:
1. Timestamp-based: uses specific timestamps for selection
2. Range-based: uses start_time/end_time for slice selection
3. Month start remapping: maps timestamps to month starts in 2013
Temporal validation ensures requested timestamps fall within data bounds. The function is cached using @custom_cached decorator for performance. Data fingerprinting is performed for debugging and validation purposes.
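The month-start remapping mode derives its selection index with `pandas.date_range`. A minimal sketch, assuming pandas is available; the `remapped` input timestamps are hypothetical:

```python
from datetime import datetime
from pandas import DatetimeIndex, date_range

# Hypothetical user-requested timestamps, already remapped to 2013
remapped = DatetimeIndex([datetime(2013, 1, 1), datetime(2013, 3, 20)])

# Month starts spanning the remapped range, as later used
# for location_time_series.sel(time=month_start_timestamps, ...)
month_starts = date_range(
    start=remapped.min().normalize(),
    end=remapped.max(),
    freq="MS",  # "month start" frequency alias
)
print(list(month_starts))
```

With this input the index covers January through March 2013; note that `freq="MS"` only emits month starts that fall inside the range, so if the normalized minimum is not itself the first of a month, that month's start is skipped.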
Source code in pvgisprototype/api/series/select.py
def select_time_series_from_array_or_set(
data: Dataset | DataArray,
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex | None,
start_time: datetime | None = None,
end_time: datetime | None = None,
remap_to_month_start: bool = False,
# convert_longitude_360: bool = False,
variable: str | None = None,
coordinate: str | None = None,
minimum: float | None = None,
maximum: float | None = None,
drop: bool = True,
neighbor_lookup: MethodForInexactMatches = MethodForInexactMatches.none,
tolerance: float | None = 0.1, # Customize default if needed
time_tolerance: str = "15m", # Important for merged Datasets
variable_name_as_suffix: bool = True,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
"""Select location-specific time series with temporal filtering and validation.
Extracts time series data for a specific geographic location from xarray data
structures with comprehensive temporal selection capabilities. Supports multiple
temporal selection modes including timestamp-based selection, time range slicing,
and month start remapping. Includes extensive validation and error handling
for temporal bounds and duplicate detection.
Parameters
----------
data : Dataset | DataArray
Input xarray Dataset or DataArray containing time series data with
spatial (longitude, latitude) and temporal dimensions.
longitude : Longitude
Longitude coordinate for location selection.
latitude : Latitude
Latitude coordinate for location selection.
timestamps : DatetimeIndex | None
Specific timestamps for temporal selection. If None, uses start_time/end_time
for range selection.
start_time : datetime | None, optional
Start time for temporal range selection. Mutually exclusive with timestamps,
by default None
end_time : datetime | None, optional
End time for temporal range selection. Mutually exclusive with timestamps,
by default None
remap_to_month_start : bool, optional
Whether to remap all timestamps to month start dates in reference year 2013,
by default False
variable : str | None, optional
Variable name to extract from Dataset. Required when data is a Dataset,
by default None
coordinate : str | None, optional
Coordinate dimension name for filtering by minimum/maximum values,
by default None
minimum : float | None, optional
Minimum value for coordinate filtering. Used with coordinate parameter,
by default None
maximum : float | None, optional
Maximum value for coordinate filtering. Used with coordinate parameter,
by default None
drop : bool, optional
Whether to drop filtered coordinates from the result,
by default True
neighbor_lookup : MethodForInexactMatches | None, optional
Spatial interpolation method when exact coordinate matches are not found,
by default None
tolerance : float | None, optional
Maximum distance tolerance for spatial interpolation,
by default 0.1
time_tolerance : str, optional
Temporal tolerance for timestamp matching in merged datasets,
by default "15m"
variable_name_as_suffix : bool, optional
Whether to use variable name as suffix in output naming,
by default True
verbose : int, optional
Verbosity level for debug output and warnings,
by default VERBOSE_LEVEL_DEFAULT
log : int, optional
Logging level for function call logging and data fingerprinting,
by default LOG_LEVEL_DEFAULT
Returns
-------
DataArray
Location and temporally filtered time series data as an xarray DataArray.
Single timestamp selections return scalar-like DataArrays.
Raises
------
ValueError
If requested timestamps fall outside the data's temporal range, if duplicate
timestamps are detected when not using month start remapping, or if remapped
timestamps are empty.
KeyError
If no data is found for one or more requested timestamps.
Warnings
--------
Logs warnings for single timestamp selections and single value results.
Notes
-----
The function supports three temporal selection modes:
1. Timestamp-based: Uses specific timestamps for selection
2. Range-based: Uses start_time/end_time for slice selection
3. Month start remapping: Maps timestamps to month starts in 2013
Temporal validation ensures requested timestamps fall within data bounds.
The function is cached using @custom_cached decorator for performance.
Data fingerprinting is performed for debugging and validation purposes.
"""
from pvgisprototype.api.series.open import (
select_location_time_series_from_array_or_set,
)
# logger.debug(f"Data : {data}", alt=f"[bold]Data[/bold] : {data}")
if longitude and latitude:
coordinates = f"Requested location coordinates : {longitude}, {latitude}"
coordinates_alternative = (
f"[bold]Requested[/bold] location coordinates : {longitude}, {latitude}"
)
logger.debug(coordinates, alt=coordinates_alternative)
location_time_series = select_location_time_series_from_array_or_set(
data=data,
coordinate=coordinate,
minimum=minimum,
maximum=maximum,
drop=drop,
longitude=longitude,
latitude=latitude,
variable=variable,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
verbose=verbose,
# log=log,
)
logger.debug(
f"Selected location from time series : {location_time_series}",
alt=f"Selected [brown]location[/brown] from time series : {location_time_series}",
)
# ------------------------------------------------------------------------
if (start_time or end_time) and not remap_to_month_start:
timestamps = None # we don't need a timestamp anymore!
if start_time and not end_time: # set `end_time` to end of series
end_time = location_time_series.time.values[-1]
elif end_time and not start_time: # set `start_time` to beginning of series
start_time = location_time_series.time.values[0]
else: # Convert `start_time` & `end_time` to the correct string format
start_time = start_time.strftime("%Y-%m-%d %H:%M:%S")
end_time = end_time.strftime("%Y-%m-%d %H:%M:%S")
try:
location_time_series = location_time_series.sel(
time=slice(start_time, end_time)
)
except Exception:
logger.exception(
f"No data found for the period from {start_time} to {end_time}."
)
if remap_to_month_start:
logger.debug(
f"Remapping all timestamps for {data} to the reference year 2013",
alt=f"[bold]Remapping[/bold] all timestamps for {data} to the reference year 2013",
)
remapped_timestamps = timestamps.map(remap_to_2013)
if not remapped_timestamps.empty:
from pandas import date_range
month_start_timestamps = date_range(
start=remapped_timestamps.min().normalize(),
end=remapped_timestamps.max(),
freq="MS",
)
try:
location_time_series = location_time_series.sel(
time=month_start_timestamps, method=neighbor_lookup
)
except Exception:
logger.exception(
f"No data found for the given 'month start' timestamps {month_start_timestamps}.",
alt=f"[red]No data found for the given 'month start' timestamps {month_start_timestamps}[/red].",
)
else:
error_message = "Remapped timestamps are empty, cannot proceed with date range creation."
logger.error(error_message)
raise ValueError(error_message)
if timestamps is not None and not start_time and not end_time:
data_time_min = location_time_series.time.min().values
data_time_max = location_time_series.time.max().values
if len(timestamps) == 1:
logger.warning(
"Single timestamp selected!",
alt="[bold][yellow]Single timestamp selected![/yellow][/bold]",
)
timestamps = start_time = end_time = timestamps[0]
try:
location_time_series = location_time_series.sel(
time=timestamps,
method=neighbor_lookup,
# tolerance=time_tolerance,
)
if (
"time" in location_time_series.coords
and location_time_series.time.size > 1
):
if location_time_series.indexes["time"].duplicated().any():
logger.error(
"Duplicate timestamps detected in location_time_series.",
alt="[red]Duplicate timestamps detected in location_time_series![/red]",
)
if (
not remap_to_month_start
and location_time_series.indexes["time"].duplicated().any()
):
raise ValueError("Duplicate timestamps detected!")
logger.debug(
f"Selected timestamps from location time series : {location_time_series}",
alt=f"[bold]Selected[/bold] [blue]timestamps[/blue] from [brown]location[/brown] time series : {location_time_series}",
)
else:
logger.debug(
f"Single timestamp selected: {location_time_series.time.values}"
)
except KeyError:
error_message = f"No data found for one or more of the requested timestamps : {timestamps}."
logger.exception(error_message)
raise ValueError(error_message)
if location_time_series.size == 1:
single_value = float(location_time_series.values)
warning = (
f"{exclamation_mark} The selected timestamp "
+ f"{location_time_series.time.values}"
+ " matches the single value "
+ f"{single_value}"
)
logger.warning(warning)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
log_data_fingerprint(
data=location_time_series.values,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
return location_time_series
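The three temporal selection modes described in the Notes above can be sketched with plain `pandas`/`xarray` calls. The data here is synthetic, standing in for the location time series returned by the spatial selection step:

```python
import numpy as np
import pandas as pd
import xarray as xr

# Synthetic hourly series standing in for a selected location time series.
times = pd.date_range("2013-01-01", periods=48, freq="h")
series = xr.DataArray(np.arange(48.0), coords={"time": times}, dims="time")

# Mode 1 -- timestamp-based: explicit timestamps with a nearest-neighbour lookup
picked = series.sel(time=pd.DatetimeIndex(["2013-01-01 05:20"]), method="nearest")

# Mode 2 -- range-based: slice between start_time and end_time (both inclusive)
window = series.sel(time=slice("2013-01-01 06:00", "2013-01-01 12:00"))

# Mode 3 -- month-start remapping target: a date_range with freq="MS",
# as built in the source above
month_starts = pd.date_range(start=times.min().normalize(), end=times.max(), freq="MS")
```

Note that the slice endpoints are label-based and inclusive, unlike positional slicing.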
spectral_factor ¶
Functions:
| Name | Description |
|---|---|
get_spectral_factor_series | |
get_spectral_factor_series_from_array_or_set | Extract spectral factor time series from xarray DataArray or Dataset. |
get_spectral_factor_series ¶
get_spectral_factor_series(
longitude: float,
latitude: float,
timestamps: DatetimeIndex = str(now()),
spectral_factor_series: (
SpectralFactorSeries | Path
) = array(SPECTRAL_FACTOR_DEFAULT),
neighbor_lookup: (
MethodForInexactMatches | None
) = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Source code in pvgisprototype/api/series/spectral_factor.py
def get_spectral_factor_series(
longitude: float,
latitude: float,
timestamps: DatetimeIndex = str(Timestamp.now()),
spectral_factor_series: SpectralFactorSeries | Path = numpy.array(
SPECTRAL_FACTOR_DEFAULT
),
neighbor_lookup: MethodForInexactMatches | None = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
""" """
if isinstance(spectral_factor_series, Path):
from pvgisprototype.api.series.select import select_time_series
# from pvgisprototype.api.utilities.conversions import (
# convert_float_to_degrees_if_requested,
# )
# from pvgisprototype.constants import DEGREES
return SpectralFactorSeries(
value=select_time_series(
time_series=spectral_factor_series,
# longitude=convert_float_to_degrees_if_requested(longitude, DEGREES),
longitude=longitude.degrees,
# latitude=convert_float_to_degrees_if_requested(latitude, DEGREES),
latitude=latitude.degrees,
timestamps=timestamps,
remap_to_month_start=False,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=0, # no verbosity here by choice!
log=log,
)
.to_numpy()
.astype(dtype=dtype),
unit=UNITLESS,
data_source=spectral_factor_series.name,
)
else:
return spectral_factor_series
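The `Path`-versus-series dispatch above can be illustrated with a minimal stand-in. `load_series` is a hypothetical helper, not part of the package; a real implementation would call `select_time_series` in the `Path` branch:

```python
from pathlib import Path

def load_series(source):
    """Mirror the Path-vs-series dispatch used above (hypothetical helper)."""
    if isinstance(source, Path):
        # A real implementation would select the series from the file here.
        return f"selected-from:{source.name}"
    return source  # already an in-memory series : pass it through unchanged
```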
get_spectral_factor_series_from_array_or_set ¶
get_spectral_factor_series_from_array_or_set(
longitude: float,
latitude: float,
spectral_factor_series: DataArray | Dataset,
timestamps: DatetimeIndex = str(now()),
neighbor_lookup: (
MethodForInexactMatches | None
) = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Extract spectral factor time series from xarray DataArray or Dataset.
Selects and extracts spectral factor data for a specific geographic location and time period from an xarray DataArray or Dataset. Performs spatial interpolation using the specified neighbor lookup method and temporal selection based on the provided timestamps. Returns a structured SpectralFactorSeries object with proper units and metadata.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
longitude | float | Longitude coordinate for data extraction (in degrees or radians). Will be converted to degrees internally if needed. | required |
latitude | float | Latitude coordinate for data extraction (in degrees or radians). Will be converted to degrees internally if needed. | required |
spectral_factor_series | DataArray | Dataset | Input xarray DataArray or Dataset containing spectral factor data with spatial (longitude, latitude) and temporal dimensions. | required |
timestamps | DatetimeIndex | Time index for temporal selection of the data, by default str(Timestamp.now()) | str(now()) |
neighbor_lookup | MethodForInexactMatches | None | Method for spatial interpolation when exact coordinate matches are not found, by default NEIGHBOR_LOOKUP_DEFAULT | NEIGHBOR_LOOKUP_DEFAULT |
tolerance | float | None | Maximum distance tolerance for spatial interpolation, by default TOLERANCE_DEFAULT | TOLERANCE_DEFAULT |
dtype | str | Data type for the output numpy array values, by default DATA_TYPE_DEFAULT | DATA_TYPE_DEFAULT |
log | int | Logging level for debug output, by default LOG_LEVEL_DEFAULT | LOG_LEVEL_DEFAULT |
Returns:
| Type | Description |
|---|---|
SpectralFactorSeries | Structured spectral factor time series object containing: - value: 1D numpy array with spectral factor values - unit: Unitless designation (spectral factors are dimensionless) - data_source: Original data source name from input |
Raises:
| Type | Description |
|---|---|
TypeError | If spectral_factor_series is not a DataArray or Dataset. |
Source code in pvgisprototype/api/series/spectral_factor.py
def get_spectral_factor_series_from_array_or_set(
longitude: float,
latitude: float,
spectral_factor_series: DataArray | Dataset,
timestamps: DatetimeIndex = str(Timestamp.now()),
neighbor_lookup: MethodForInexactMatches | None = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
"""Extract spectral factor time series from xarray DataArray or Dataset.
Selects and extracts spectral factor data for a specific geographic location
and time period from an xarray DataArray or Dataset. Performs spatial
interpolation using the specified neighbor lookup method and temporal selection
based on the provided timestamps. Returns a structured SpectralFactorSeries
object with proper units and metadata.
Parameters
----------
longitude : float
Longitude coordinate for data extraction (in degrees or radians).
Will be converted to degrees internally if needed.
latitude : float
Latitude coordinate for data extraction (in degrees or radians).
Will be converted to degrees internally if needed.
spectral_factor_series : DataArray | Dataset
Input xarray DataArray or Dataset containing spectral factor data
with spatial (longitude, latitude) and temporal dimensions.
timestamps : DatetimeIndex, optional
Time index for temporal selection of the data,
by default str(Timestamp.now())
neighbor_lookup : MethodForInexactMatches | None, optional
Method for spatial interpolation when exact coordinate matches are not found,
by default NEIGHBOR_LOOKUP_DEFAULT
tolerance : float | None, optional
Maximum distance tolerance for spatial interpolation,
by default TOLERANCE_DEFAULT
dtype : str, optional
Data type for the output numpy array values,
by default DATA_TYPE_DEFAULT
log : int, optional
Logging level for debug output,
by default LOG_LEVEL_DEFAULT
Returns
-------
SpectralFactorSeries
Structured spectral factor time series object containing:
- value: 1D numpy array with spectral factor values
- unit: Unitless designation (spectral factors are dimensionless)
- data_source: Original data source name from input
Raises
------
TypeError
If spectral_factor_series is not a DataArray or Dataset.
"""
from pvgisprototype.api.series.select import select_time_series_from_array_or_set
if isinstance(spectral_factor_series, DataArray | Dataset):
from pvgisprototype.api.utilities.conversions import (
convert_float_to_degrees_if_requested,
)
from pvgisprototype.constants import DEGREES
spectral_factor_time_series = (
select_time_series_from_array_or_set(
data=spectral_factor_series,
longitude=convert_float_to_degrees_if_requested(longitude, DEGREES),
latitude=convert_float_to_degrees_if_requested(latitude, DEGREES),
timestamps=timestamps,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
verbose=0, # no verbosity here by choice!
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
if (
spectral_factor_time_series.size == 1
and spectral_factor_time_series.shape == ()
):
spectral_factor_time_series = numpy.array(
[spectral_factor_time_series], dtype=dtype
)
else:
raise TypeError("Spectral factor series must be a DataArray or Dataset.")
return SpectralFactorSeries(
value=spectral_factor_time_series,
unit=UNITLESS,
data_source=spectral_factor_series.name,
)
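The scalar guard near the end of the source above promotes a 0-d result (from a single-timestamp selection) to a 1-element 1-D array, so downstream code can always iterate. In isolation:

```python
import numpy as np

# A single-timestamp selection yields a 0-d array with shape ().
scalar_result = np.array(1.0234)
if scalar_result.size == 1 and scalar_result.shape == ():
    # Promote to a 1-element 1-D array for consistent downstream handling.
    scalar_result = np.array([scalar_result], dtype="float64")
```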
temperature ¶
Functions:
| Name | Description |
|---|---|
get_temperature_series | |
get_temperature_series_from_array_or_set | Extract temperature time series from xarray DataArray or Dataset. |
get_temperature_series ¶
get_temperature_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex = str(now()),
temperature_series: TemperatureSeries | Path = array(
TEMPERATURE_DEFAULT
),
neighbor_lookup: MethodForInexactMatches = nearest,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Source code in pvgisprototype/api/series/temperature.py
def get_temperature_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex = str(Timestamp.now()),
temperature_series: TemperatureSeries | Path = array(TEMPERATURE_DEFAULT),
neighbor_lookup: MethodForInexactMatches = MethodForInexactMatches.nearest,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
""" """
if isinstance(temperature_series, Path):
from pvgisprototype.api.series.select import select_time_series
# from pvgisprototype.api.utilities.conversions import (
# convert_float_to_degrees_if_requested,
# )
# from pvgisprototype.constants import DEGREES
temperature_times_series = (
select_time_series(
time_series=temperature_series,
longitude=longitude.degrees,
# longitude=longitude,
latitude=latitude.degrees,
# latitude=latitude,
timestamps=timestamps,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=0, # no verbosity here by choice!
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
if temperature_times_series.size == 1 and temperature_times_series.shape == ():
temperature_times_series = array([temperature_times_series], dtype=dtype)
return TemperatureSeries(
value=temperature_times_series,
# unit=SYMBOL_UNIT_TEMPERATURE,
data_source=temperature_series.name,
)
else:
return temperature_series
get_temperature_series_from_array_or_set ¶
get_temperature_series_from_array_or_set(
longitude: float,
latitude: float,
temperature_series: DataArray | Dataset,
timestamps: DatetimeIndex = str(now()),
neighbor_lookup: (
MethodForInexactMatches | None
) = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Extract temperature time series from xarray DataArray or Dataset.
Selects and extracts temperature data for a specific geographic location and time period from an xarray DataArray or Dataset. Performs spatial interpolation using the specified neighbor lookup method and temporal selection based on the provided timestamps. Returns a structured TemperatureSeries object with proper units and metadata.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
longitude | float | Longitude coordinate for data extraction (in degrees or radians). Will be converted to degrees internally if needed. | required |
latitude | float | Latitude coordinate for data extraction (in degrees or radians). Will be converted to degrees internally if needed. | required |
temperature_series | DataArray | Dataset | Input xarray DataArray or Dataset containing temperature data with spatial (longitude, latitude) and temporal dimensions. | required |
timestamps | DatetimeIndex | Time index for temporal selection of the data, by default str(Timestamp.now()) | str(now()) |
neighbor_lookup | MethodForInexactMatches | None | Method for spatial interpolation when exact coordinate matches are not found, by default NEIGHBOR_LOOKUP_DEFAULT | NEIGHBOR_LOOKUP_DEFAULT |
tolerance | float | None | Maximum distance tolerance for spatial interpolation, by default TOLERANCE_DEFAULT | TOLERANCE_DEFAULT |
dtype | str | Data type for the output numpy array values, by default DATA_TYPE_DEFAULT | DATA_TYPE_DEFAULT |
log | int | Logging level for debug output, by default LOG_LEVEL_DEFAULT | LOG_LEVEL_DEFAULT |
Returns:
| Type | Description |
|---|---|
TemperatureSeries | Structured temperature time series object containing: - value: 1D numpy array with temperature values - unit: Temperature unit designation (typically Celsius or Kelvin) - data_source: Original data source name from input |
Raises:
| Type | Description |
|---|---|
TypeError | If temperature_series is not a DataArray or Dataset. |
Notes
The function automatically handles coordinate conversion to ensure compatibility with the underlying data. Scalar results are converted to 1D arrays for consistency in downstream processing. Temperature units are preserved from the original dataset metadata.
Source code in pvgisprototype/api/series/temperature.py
def get_temperature_series_from_array_or_set(
longitude: float,
latitude: float,
temperature_series: DataArray | Dataset,
timestamps: DatetimeIndex = str(Timestamp.now()),
neighbor_lookup: MethodForInexactMatches | None = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
"""Extract temperature time series from xarray DataArray or Dataset.
Selects and extracts temperature data for a specific geographic location
and time period from an xarray DataArray or Dataset. Performs spatial
interpolation using the specified neighbor lookup method and temporal selection
based on the provided timestamps. Returns a structured TemperatureSeries
object with proper units and metadata.
Parameters
----------
longitude : float
Longitude coordinate for data extraction (in degrees or radians).
Will be converted to degrees internally if needed.
latitude : float
Latitude coordinate for data extraction (in degrees or radians).
Will be converted to degrees internally if needed.
temperature_series : DataArray | Dataset
Input xarray DataArray or Dataset containing temperature data
with spatial (longitude, latitude) and temporal dimensions.
timestamps : DatetimeIndex, optional
Time index for temporal selection of the data,
by default str(Timestamp.now())
neighbor_lookup : MethodForInexactMatches | None, optional
Method for spatial interpolation when exact coordinate matches are not found,
by default NEIGHBOR_LOOKUP_DEFAULT
tolerance : float | None, optional
Maximum distance tolerance for spatial interpolation,
by default TOLERANCE_DEFAULT
dtype : str, optional
Data type for the output numpy array values,
by default DATA_TYPE_DEFAULT
log : int, optional
Logging level for debug output,
by default LOG_LEVEL_DEFAULT
Returns
-------
TemperatureSeries
Structured temperature time series object containing:
- value: 1D numpy array with temperature values
- unit: Temperature unit designation (typically Celsius or Kelvin)
- data_source: Original data source name from input
Raises
------
TypeError
If temperature_series is not a DataArray or Dataset.
Notes
-----
The function automatically handles coordinate conversion to ensure compatibility
with the underlying data. Scalar results are converted to 1D arrays for
consistency in downstream processing. Temperature units are preserved from
the original dataset metadata.
"""
from pvgisprototype.api.series.select import select_time_series_from_array_or_set
if isinstance(temperature_series, DataArray | Dataset):
from pvgisprototype.api.utilities.conversions import (
convert_float_to_degrees_if_requested,
)
from pvgisprototype.constants import DEGREES
temperature_times_series = (
select_time_series_from_array_or_set(
data=temperature_series,
longitude=convert_float_to_degrees_if_requested(longitude, DEGREES),
# longitude=longitude,
latitude=convert_float_to_degrees_if_requested(latitude, DEGREES),
# latitude=latitude,
timestamps=timestamps,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
verbose=0, # no verbosity here by choice!
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
if temperature_times_series.size == 1 and temperature_times_series.shape == ():
temperature_times_series = array([temperature_times_series], dtype=dtype)
else:
raise TypeError("Temperature series must be a DataArray or Dataset.")
return TemperatureSeries(
value=temperature_times_series,
data_source=temperature_series.name,
)
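The spatial selection these functions delegate to boils down to an xarray `sel` with a nearest-neighbour method and a distance tolerance. A minimal sketch with synthetic temperature values:

```python
import numpy as np
import xarray as xr

# A tiny 2x2 grid of temperatures (Kelvin, synthetic values).
temperatures = xr.DataArray(
    np.array([[280.0, 281.0], [282.0, 283.0]]),
    coords={"latitude": [45.0, 45.1], "longitude": [8.0, 8.1]},
    dims=("latitude", "longitude"),
)

# Nearest grid cell within 0.1 degrees of the requested point.
point = temperatures.sel(latitude=45.04, longitude=8.09, method="nearest", tolerance=0.1)
```

If no grid cell lies within the tolerance on a dimension, `sel` raises a `KeyError` instead of silently returning a distant neighbour.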
time_series ¶
Functions:
| Name | Description |
|---|---|
get_time_series_as_arrays_or_sets | Open (lazy load) time series datasets from file paths into xarray DataArrays or Datasets. |
get_time_series_as_arrays_or_sets ¶
get_time_series_as_arrays_or_sets(
dataset: dict,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
) -> dict
Open (lazy load) time series datasets from file paths into xarray DataArrays or Datasets.
Opens multiple time series datasets by reading each file path provided in the input dictionary and returns them as a dictionary of opened (lazy loaded) xarray objects.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
dataset | dict | Dictionary mapping dataset names to file paths. Keys are dataset names, values are file paths to be opened (lazy loaded). | required |
mask_and_scale | bool | Whether to apply mask and scale transformations during opening, by default MASK_AND_SCALE_FLAG_DEFAULT | MASK_AND_SCALE_FLAG_DEFAULT |
in_memory | bool | Whether to open datasets entirely into memory or use lazy loading, by default IN_MEMORY_FLAG_DEFAULT | IN_MEMORY_FLAG_DEFAULT |
verbose | int | Verbosity level for logging output during opening process, by default VERBOSE_LEVEL_DEFAULT | VERBOSE_LEVEL_DEFAULT |
Returns:
| Type | Description |
|---|---|
dict | Dictionary mapping dataset names to opened (lazy loaded) xarray DataArrays or Datasets. Keys match the input dataset names, values are the opened xarray objects. |
Source code in pvgisprototype/api/series/time_series.py
def get_time_series_as_arrays_or_sets(
dataset: dict,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
) -> dict:
"""Open (lazy load) time series datasets from file paths into xarray DataArrays or Datasets.
Opens multiple time series datasets by reading each file path provided in the
input dictionary and returns them as a dictionary of opened (lazy loaded) xarray objects.
Parameters
----------
dataset : dict
Dictionary mapping dataset names to file paths. Keys are dataset names,
values are file paths to be opened (lazy loaded).
mask_and_scale : bool, optional
Whether to apply mask and scale transformations during opening,
by default MASK_AND_SCALE_FLAG_DEFAULT
in_memory : bool, optional
Whether to open datasets entirely into memory or use lazy loading,
by default IN_MEMORY_FLAG_DEFAULT
verbose : int, optional
Verbosity level for logging output during opening process,
by default VERBOSE_LEVEL_DEFAULT
Returns
-------
dict
Dictionary mapping dataset names to opened (lazy loaded) xarray DataArrays or Datasets.
Keys match the input dataset names, values are the opened xarray objects.
"""
opened_dataset: dict = {}
for name, path in dataset.items():
if path is None:
opened_dataset[name] = None
continue
opened_dataset[name] = read_data_array_or_set(
input_data=path,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=verbose,
)
return opened_dataset
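The open loop above, including its `None`-path handling, can be shown with a stand-in reader in place of `read_data_array_or_set` (the reader and paths here are hypothetical):

```python
def fake_reader(path):
    """Stand-in for read_data_array_or_set (hypothetical)."""
    return f"opened:{path}"

# Keys are dataset names; a None path is carried through unopened.
dataset = {"temperature": "/data/t2m.nc", "spectral_factor": None}
opened = {}
for name, path in dataset.items():
    opened[name] = None if path is None else fake_reader(path)
```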
utilities ¶
Functions:
| Name | Description |
|---|---|
filter_xarray | Filter a Dataset or DataArray based on a given coordinate with specified minimum and/or |
get_scale_and_offset | Get scale and offset values from a netCDF file using xarray |
load_or_open_dataarray_from_dataset | Load or open a variable from a dataset and select coordinates. |
open_data_array | |
open_data_set | Open or load a dataset based on the input flags. |
read_data_array_or_set | Open the data and determine if it's a DataArray or Dataset. |
select_coordinates | Select single pair of coordinates from a data array |
select_location_time_series | Select a location from a time series data format supported by xarray |
set_location_indexers | Select single pair of coordinates from a data array |
filter_xarray ¶
filter_xarray(
data: Dataset | DataArray,
coordinate: str,
minimum: float | None,
maximum: float | None,
drop: bool = True,
) -> Dataset | DataArray
Filter a Dataset or DataArray based on a given coordinate with specified minimum and/or maximum values. If the minimum or maximum is None, the function will ignore that bound.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
data | Dataset | DataArray | The input xarray Dataset or DataArray to filter. | required |
coordinate | str | The name of the coordinate within the Dataset or DataArray to apply the filter on. | required |
minimum | float or None | The minimum value for the coordinate. If None, no lower bound is applied. | required |
maximum | float or None | The maximum value for the coordinate. If None, no upper bound is applied. | required |
drop | bool | Whether to drop values that fall outside the range, by default True. | True |
Returns:
| Type | Description |
|---|---|
Dataset | DataArray | The filtered xarray Dataset or DataArray, where values outside the [minimum, maximum] range are dropped or masked. |
Raises:
| Type | Description |
|---|---|
ValueError | If the coordinate is not present in the input data. |
Notes
- If both `minimum` and `maximum` are None, the input data is returned unfiltered.
- Emits a warning via logger if any values exceed the bounds.
Source code in pvgisprototype/api/series/utilities.py
def filter_xarray(
data: Dataset | DataArray,
coordinate: str,
minimum: float | None,
maximum: float | None,
drop: bool = True,
) -> Dataset | DataArray:
"""
Filter a Dataset or DataArray based on a given coordinate with specified minimum and/or
maximum values. If the `minimum` or `maximum` is None, the function will ignore that bound.
Parameters
----------
data : Dataset | DataArray
The input xarray Dataset or DataArray to filter.
coordinate : str
The name of the coordinate within the Dataset or DataArray to apply the filter on.
minimum : float or None
The minimum value for the coordinate. If None, no lower bound is applied.
maximum : float or None
The maximum value for the coordinate. If None, no upper bound is applied.
drop : bool, optional
Whether to drop values that fall outside the range, by default True.
Returns
-------
Dataset | DataArray
The filtered xarray Dataset or DataArray, where values outside the
[minimum, maximum] range are dropped or masked.
Raises
------
ValueError
If the coordinate is not present in the input data.
Notes
-----
- If both `minimum` and `maximum` are None, the input data is returned unfiltered.
- Emits a warning via logger if any values exceed the bounds.
"""
if coordinate not in data.coords:
raise ValueError(f"Coordinate '{coordinate}' not found in the dataset.")
if minimum is None and maximum is None:
return data  # no bounds requested : return the input data unfiltered
condition = True  # Start with an always-true condition
if minimum is not None:
condition &= data[coordinate] >= minimum
if maximum is not None:
condition &= data[coordinate] <= maximum
# values outside the requested range ?
if numpy.any(~condition):
warning_message = f"{x_mark} The input data exceed the reference range [{minimum}, {maximum}]."
warning_alternative = f"{x_mark} [bold]The input data [red]exceed[/red] the reference range[/bold] [{minimum}, {maximum}]."
typer.echo(warning_message)
logger.warning(warning_message, alt=warning_alternative)
else:
success_message = f"{check_mark} The input data are within the reference range [{minimum}, {maximum}]."
typer.echo(success_message)
logger.debug(success_message)
return data.where(condition, drop=drop)
get_scale_and_offset ¶
Get scale and offset values from a netCDF file using xarray
Source code in pvgisprototype/api/series/utilities.py
def get_scale_and_offset(netcdf):
"""Get scale and offset values from a netCDF file using xarray"""
import xarray as xr
# Open the dataset using xarray
dataset = xr.open_dataset(netcdf)
# Get all dimensions
netcdf_dimensions = set(dataset.dims)
# Get all variables
netcdf_variables = set(dataset.data_vars)
# Assuming the first variable that is not a dimension is the target variable
variable = list(netcdf_variables.difference(netcdf_dimensions))[0]
# Get the variable's attributes
variable_attrs = dataset[variable].attrs
# Retrieve scale_factor and add_offset attributes if they exist
scale_factor = variable_attrs.get("scale_factor", None)
add_offset = variable_attrs.get("add_offset", None)
return (scale_factor, add_offset)
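The attribute lookup at the heart of `get_scale_and_offset` can be exercised without a file on disk, by building a Dataset in memory and applying the same variable-discovery step:

```python
import numpy as np
import xarray as xr

# Build an in-memory dataset with CF-style packing attributes.
da = xr.DataArray(np.zeros(3), dims="time", name="t2m")
da.attrs["scale_factor"] = 0.01
da.attrs["add_offset"] = 273.15
ds = da.to_dataset()

# Same discovery as the source above: first variable that is not a dimension.
variable = list(set(ds.data_vars).difference(set(ds.dims)))[0]
scale_factor = ds[variable].attrs.get("scale_factor", None)
add_offset = ds[variable].attrs.get("add_offset", None)
```

Note this only reads the attributes; decoding (`raw * scale_factor + add_offset`) is what `mask_and_scale=True` applies automatically on open.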
load_or_open_dataarray_from_dataset ¶
load_or_open_dataarray_from_dataset(
dataset: Path,
variable: str | None = None,
longitude: float | None = None,
latitude: float | None = None,
time: str | None = None,
column_numbers: str | None = None,
mask_and_scale: bool = False,
in_memory: bool = False,
method: str = "nearest",
tolerance: float = 0.1,
verbose: int = 0,
)
Load or open a variable from a dataset and select coordinates.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
dataset | Path | | required |
variable | str | None | | None |
longitude | float | None | | None |
latitude | float | None | | None |
time | str | None | | None |
mask_and_scale | bool | | False |
in_memory | bool | | False |
method | str | | 'nearest' |
tolerance | float | | 0.1 |
verbose | int | | 0 |
Source code in pvgisprototype/api/series/utilities.py
def load_or_open_dataarray_from_dataset(
dataset: Path,
variable: str | None = None,
longitude: float | None = None,
latitude: float | None = None,
time: str | None = None,
column_numbers: str | None = None,
mask_and_scale: bool = False,
in_memory: bool = False,
method: str = "nearest",
tolerance: float = 0.1,
verbose: int = 0,
):
"""
Load or open a variable from a dataset and select coordinates.
Parameters
----------
dataset: Path to the NetCDF dataset file.
variable: The variable name to extract from the dataset.
longitude: Longitude value to select.
latitude: Latitude value to select.
time: Time value to select.
mask_and_scale: Boolean to mask and scale data.
in_memory: Boolean to load dataset into memory.
method: Method for selecting nearest coordinates ('nearest').
tolerance: Tolerance level for selecting the nearest coordinate.
verbose: Verbosity level for logging.
"""
# Open the dataset
ds = open_data_set(
input_data=dataset,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=verbose,
)
# If a variable is specified, check and extract it, otherwise raise error
if variable:
if variable in ds.variables:
data_array = ds[variable]
else:
logger.error(f"{x_mark} Variable '{variable}' not found in the dataset!")
raise typer.Exit(code=33)
else:
logger.error(f"{x_mark} No variable specified!")
raise typer.Exit(code=33)
# Select coordinates for longitude, latitude, and time if provided
indexers = {}
if "longitude" in ds.coords and longitude is not None:
indexers["longitude"] = longitude
elif "lon" in ds.coords and longitude is not None:
indexers["lon"] = longitude
if "latitude" in ds.coords and latitude is not None:
indexers["latitude"] = latitude
elif "lat" in ds.coords and latitude is not None:
indexers["lat"] = latitude
if time:
indexers["time"] = time
# Apply selection using nearest method and tolerance if required
try:
data_array = data_array.sel(**indexers, method=method, tolerance=tolerance)
except Exception as e:
logger.error(f"Error in selecting data with given coordinates: {str(e)}")
raise typer.Exit(code=33)
if column_numbers:
try:
if "-" in column_numbers: # Handle range like '1-10'
start, end = map(int, column_numbers.split("-"))
data_array = data_array.isel(
center_wavelength=slice(start - 1, end)
) # Adjust to 0-based indexing
elif "," in column_numbers: # Handle list like '1,5,7'
indices = list(map(int, column_numbers.split(",")))
data_array = data_array.isel(
center_wavelength=[i - 1 for i in indices]
) # Adjust to 0-based indexing
else: # Handle single value like '1'
index = int(column_numbers) - 1 # Adjust to 0-based indexing
data_array = data_array.isel(center_wavelength=index)
except Exception as e:
logger.error(f"Error in processing column_numbers: {str(e)}")
raise typer.Exit(code=33)
if verbose > 0:
logger.debug(f"Data successfully loaded for variable '{variable}'.")
return data_array
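The `column_numbers` handling above accepts a 1-based range ('1-10'), a comma-separated list ('1,5,7'), or a single index ('3'). A minimal, xarray-free sketch of that parsing (the helper name `parse_column_numbers` is illustrative, not part of the API):

```python
def parse_column_numbers(column_numbers: str) -> list:
    """Convert a 1-based column spec to 0-based indices, as in the selection above."""
    if "-" in column_numbers:  # range like '1-10', inclusive on both ends
        start, end = map(int, column_numbers.split("-"))
        return list(range(start - 1, end))
    if "," in column_numbers:  # explicit list like '1,5,7'
        return [int(index) - 1 for index in column_numbers.split(",")]
    return [int(column_numbers) - 1]  # single value like '3'
```

The resulting indices correspond to what the function passes to `DataArray.isel(center_wavelength=...)`.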
open_data_array ¶
open_data_array(
input_data: str,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = VERBOSE_LEVEL_DEFAULT,
)
Source code in pvgisprototype/api/series/utilities.py
def open_data_array(
input_data: str,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = VERBOSE_LEVEL_DEFAULT,
):
"""Open or load a data array based on the input flags."""
if in_memory:
if verbose > 0:
logger.debug(f"Loading data array '{input_data}' in memory...")
return load_or_open_dataarray(
function=xr.load_dataarray,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
else:
if verbose > 0:
logger.debug(f"Opening data array '{input_data}'...")
return load_or_open_dataarray(
function=xr.open_dataarray,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
open_data_set ¶
open_data_set(
input_data: Path,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = 0,
)
Open or load a dataset based on the input flags.
Source code in pvgisprototype/api/series/utilities.py
def open_data_set(
input_data: Path,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = 0,
):
"""Open or load a dataset based on the input flags."""
if in_memory:
if verbose > 0:
logger.debug(f"Loading dataset '{input_data}' in memory...")
return load_or_open_dataset(
function=xr.load_dataset,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
else:
if verbose > 0:
logger.debug(f"Opening dataset '{input_data}'...")
return load_or_open_dataset(
function=xr.open_dataset,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
read_data_array_or_set ¶
read_data_array_or_set(
input_data: Path,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = 0,
)
Open the data and determine if it's a DataArray or Dataset.
Source code in pvgisprototype/api/series/utilities.py
def read_data_array_or_set(
input_data: Path,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = 0,
):
"""Open the data and determine if it's a DataArray or Dataset."""
# try reading an array
try:
if in_memory:
if verbose > 0:
logger.debug(
f" - {exclamation_mark} Trying to load {input_data} into memory as a DataArray...",
alt=f" - {exclamation_mark} [bold]Trying[/bold] to load {input_data} into memory as a DataArray...",
)
return load_or_open_dataarray(
function=xr.load_dataarray,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
else:
if verbose > 0:
logger.debug(
f" - {exclamation_mark} Trying to open {input_data} as a DataArray...",
alt=f" - {exclamation_mark} [bold]Trying[/bold] to open {input_data} as a DataArray...",
)
return load_or_open_dataarray(
function=xr.open_dataarray,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
# or a set
except Exception:
try:
if in_memory:
if verbose > 0:
logger.debug(
f" - {exclamation_mark} Trying to load {input_data} into memory as a Dataset...",
alt=f" - {exclamation_mark} [bold]Trying[/bold] to load {input_data} into memory as a Dataset...",
)
return load_or_open_dataset(
function=xr.load_dataset,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
else:
if verbose > 0:
logger.debug(
f" - {exclamation_mark} Trying to open {input_data} as a Dataset...",
alt=f" - {exclamation_mark} [bold]Trying[/bold] to open {input_data} as a Dataset...",
)
return load_or_open_dataset(
function=xr.open_dataset,
filename_or_object=input_data,
mask_and_scale=mask_and_scale,
)
except Exception as e:
logger.error(
f"Error loading or opening data: {str(e)}",
alt=f"Error loading or opening data: {str(e)}",
)
raise typer.Exit(code=33)
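`read_data_array_or_set` first attempts the DataArray readers and, on any failure, falls back to the Dataset readers. That control flow reduces to a try-first-then-fall-back pattern, sketched here with stand-in loaders (both helper names below are illustrative, not part of the API):

```python
def open_as_array(path: str):
    # stand-in for xr.load_dataarray / xr.open_dataarray,
    # which fail when the file holds more than one variable
    if "multi" in path:
        raise ValueError("file contains more than one variable")
    return ("DataArray", path)

def open_as_set(path: str):
    # stand-in for xr.load_dataset / xr.open_dataset
    return ("Dataset", path)

def read_array_or_set(path: str):
    """Try the DataArray reader first; fall back to the Dataset reader."""
    try:
        return open_as_array(path)
    except Exception:
        return open_as_set(path)
```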
select_coordinates ¶
select_coordinates(
data_array,
longitude: Longitude,
latitude: Latitude,
time: str | None = None,
method: str = "nearest",
tolerance: float = 0.1,
verbose: int = VERBOSE_LEVEL_DEFAULT,
)
Select single pair of coordinates from a data array
Will select center coordinates if none of (longitude, latitude) are provided.
Source code in pvgisprototype/api/series/utilities.py
def select_coordinates(
data_array,
longitude: Longitude,
latitude: Latitude,
time: str | None = None,
method: str = "nearest",
tolerance: float = 0.1,
verbose: int = VERBOSE_LEVEL_DEFAULT,
):
"""Select single pair of coordinates from a data array
Will select center coordinates if none of (longitude, latitude) are
provided.
"""
indexers = set_location_indexers(
data_array=data_array,
longitude=longitude,
latitude=latitude,
verbose=verbose,
)
try:
if not time:
data_array = data_array.sel(
**indexers,
method=method,
)
else:
# Review-Me ------------------------------------------------------
data_array = data_array.sel(time=time, method=method).sel(
**indexers,
method=method,
tolerance=tolerance,
)
# Review-Me ------------------------------------------------------
except Exception as exception:
print(f"{x_mark} {ERROR_IN_SELECTING_DATA} : {exception}")
raise SystemExit(33)
return data_array
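`select_coordinates` delegates to xarray's `.sel(..., method="nearest", tolerance=...)`. The underlying rule, pick the closest coordinate but fail when it lies farther away than the tolerance, can be sketched for a single axis (the helper is illustrative):

```python
def nearest_within_tolerance(values, target, tolerance):
    """Return the value closest to target, or raise if it exceeds the tolerance."""
    closest = min(values, key=lambda value: abs(value - target))
    if abs(closest - target) > tolerance:
        raise KeyError(f"no coordinate within {tolerance} of {target}")
    return closest
```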
select_location_time_series ¶
select_location_time_series(
time_series: Path,
variable: str | None = None,
coordinate: str | None = None,
minimum: float | None = None,
maximum: float | None = None,
drop: bool = True,
longitude: Longitude = None,
latitude: Latitude = None,
neighbor_lookup: (
MethodForInexactMatches | None
) = nearest,
tolerance: float = 0.1,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
) -> DataArray
Select a location from a time series data format supported by xarray
Source code in pvgisprototype/api/series/utilities.py
@log_function_call
def select_location_time_series(
time_series: Path,
variable: str | None = None,
coordinate: str | None = None,
minimum: float | None = None,
maximum: float | None = None,
drop: bool = True,
longitude: Longitude = None,
latitude: Latitude = None,
neighbor_lookup: MethodForInexactMatches | None = MethodForInexactMatches.nearest,
tolerance: float = 0.1,
mask_and_scale: bool = False,
in_memory: bool = False,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
) -> DataArray:
"""Select a location from a time series data format supported by xarray"""
context_message = (
"i Executing data selection function : select_location_time_series()"
)
context_message_alternative = "[yellow]i[/yellow] Executing [underline]data selection function[/underline] : select_location_time_series()"
logger.debug(context_message, alt=context_message_alternative)
data = read_data_array_or_set(
input_data=time_series,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=verbose,
)
if isinstance(data, xr.Dataset):
if not variable:
raise ValueError(
"You must specify a variable when selecting from a Dataset."
)
if variable not in data:
raise ValueError(f"Variable '{variable}' not found in the Dataset.")
data_array = data[variable] # Extract the DataArray from the Dataset
logger.debug(
f" {check_mark} Successfully extracted '{variable}' from the Dataset.",
alt=f" {check_mark} [green]Successfully[/green] extracted '{variable}' from the Dataset.",
)
elif isinstance(data, xr.DataArray):
data_array = data # It's already a DataArray, use it directly
else:
raise ValueError("Unsupported data type. Must be a DataArray or Dataset.")
# Is this correctly placed here ?
if coordinate and (minimum is not None or maximum is not None):
data_array = filter_xarray(
data=data_array,
coordinate=coordinate,
minimum=minimum,
maximum=maximum,
drop=drop,
)
indexers = set_location_indexers(
data_array=data_array,
longitude=longitude,
latitude=latitude,
verbose=verbose,
)
try:
location_time_series = data_array.sel(
**indexers,
method=neighbor_lookup,
tolerance=tolerance,
)
if location_time_series.isnull().all():
logger.warning("Selection returns an empty array or all NaNs.")
location_time_series.load() # load into memory for fast processing
except Exception as exception:
# Print the error message directly to stderr to ensure it's always shown
error_message = f"Error in selecting data from {time_series} : {exception}."
error_message_alternative = (
f"Error in selecting data from [code]{time_series}[/code] : {exception}."
)
print(f"{error_message}\n")
logger.error(
error_message,
alt=error_message_alternative,
)
raise SystemExit(33)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
logger.debug(
f" < Returning selected location from time series : {location_time_series}",
alt=f" [green bold]<[/green bold] [bold]Returning[/bold] selected [brown]location[/brown] from time series : {location_time_series}",
)
return location_time_series
set_location_indexers ¶
set_location_indexers(
data_array,
longitude: Longitude = None,
latitude: Latitude = None,
verbose: int = VERBOSE_LEVEL_DEFAULT,
)
Select single pair of coordinates from a data array
Will select center coordinates if none of (longitude, latitude) are provided.
Source code in pvgisprototype/api/series/utilities.py
def set_location_indexers(
data_array,
longitude: Longitude = None,
latitude: Latitude = None,
verbose: int = VERBOSE_LEVEL_DEFAULT,
):
"""Select single pair of coordinates from a data array
Will select center coordinates if none of (longitude, latitude) are
provided.
"""
# ----------------------------------------------------------- Deduplicate me
# Ugly hack for when dimensions 'longitude', 'latitude' are not spelled out!
# Use `coords` : a time series of a single pair of coordinates has only a `time` dimension!
indexers = {}
x = y = None  # avoid a NameError when no spatial dimensions are detected
dimensions = [
dimension for dimension in data_array.coords if isinstance(dimension, str)
]
if set(["lon", "lat"]) & set(dimensions):
x = "lon"
y = "lat"
elif set(["longitude", "latitude"]) & set(dimensions):
x = "longitude"
y = "latitude"
if x and y:
logger.debug(
f" {check_mark} Location specific dimensions detected in '{data_array.name}' : {x}, {y}"
)
if longitude is None or latitude is None:
warning = f" {check_mark} Coordinates (longitude, latitude) not provided. Selecting center coordinates."
logger.warning(warning)
center_longitude = float(data_array[x][len(data_array[x]) // 2])
center_latitude = float(data_array[y][len(data_array[y]) // 2])
indexers[x] = center_longitude
indexers[y] = center_latitude
text_coordinates = f"{check_mark} Center coordinates (longitude, latitude) : {center_longitude}, {center_latitude}."
else:
indexers[x] = longitude
indexers[y] = latitude
text_coordinates = f" {check_mark} Coordinates : {longitude}, {latitude}."
logger.debug(text_coordinates)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return indexers
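When neither longitude nor latitude is given, the function above falls back to the center element of each spatial axis (`len(axis) // 2`). In isolation:

```python
def center_indexers(coordinates: dict) -> dict:
    """Pick the middle value of each 1-D coordinate axis, as the fallback above does."""
    return {name: values[len(values) // 2] for name, values in coordinates.items()}
```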
wind_speed ¶
Functions:
| Name | Description |
|---|---|
get_wind_speed_series | |
get_wind_speed_series_from_array_or_set | Extract wind speed time series from xarray DataArray or Dataset. |
get_wind_speed_series ¶
get_wind_speed_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex = str(now()),
wind_speed_series: WindSpeedSeries | Path = array(
TEMPERATURE_DEFAULT
),
neighbor_lookup: MethodForInexactMatches = nearest,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Source code in pvgisprototype/api/series/wind_speed.py
def get_wind_speed_series(
longitude: Longitude,
latitude: Latitude,
timestamps: DatetimeIndex = str(Timestamp.now()),
wind_speed_series: WindSpeedSeries | Path = array(TEMPERATURE_DEFAULT),
neighbor_lookup: MethodForInexactMatches = MethodForInexactMatches.nearest,
tolerance: float | None = TOLERANCE_DEFAULT,
mask_and_scale: bool = MASK_AND_SCALE_FLAG_DEFAULT,
in_memory: bool = IN_MEMORY_FLAG_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
multi_thread: bool = MULTI_THREAD_FLAG_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
"""Return a wind speed series, reading it from a file when a path is given."""
if isinstance(wind_speed_series, Path):
from pvgisprototype.api.series.select import select_time_series
# from pvgisprototype.api.utilities.conversions import (
# convert_float_to_degrees_if_requested,
# )
# from pvgisprototype.constants import DEGREES
wind_speed_time_series = (
select_time_series(
time_series=wind_speed_series,
# longitude=convert_float_to_degrees_if_requested(longitude, DEGREES),
longitude=longitude.degrees,
# latitude=convert_float_to_degrees_if_requested(latitude, DEGREES),
latitude=latitude.degrees,
timestamps=timestamps,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
mask_and_scale=mask_and_scale,
in_memory=in_memory,
verbose=0, # no verbosity here by choice!
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
if wind_speed_time_series.size == 1 and wind_speed_time_series.shape == ():
wind_speed_time_series = array([wind_speed_time_series], dtype=dtype)
return WindSpeedSeries(
value=wind_speed_time_series,
# unit=SYMBOL_UNIT_WIND_SPEED,
data_source=wind_speed_series.name,
)
else:
return wind_speed_series
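Selecting a single timestamp from the time series yields a 0-d NumPy scalar, which the function normalises to a one-element 1-D array before wrapping it in `WindSpeedSeries`. That normalisation in isolation (the helper name is illustrative):

```python
import numpy

def ensure_1d(values, dtype="float64"):
    """Wrap a 0-d scalar in a one-element array; leave higher-rank arrays untouched."""
    values = numpy.asarray(values, dtype=dtype)
    if values.size == 1 and values.shape == ():  # 0-d result of a point selection
        values = numpy.array([values], dtype=dtype)
    return values
```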
get_wind_speed_series_from_array_or_set ¶
get_wind_speed_series_from_array_or_set(
longitude: float,
latitude: float,
wind_speed_series: DataArray | Dataset,
timestamps: DatetimeIndex = str(now()),
neighbor_lookup: (
MethodForInexactMatches | None
) = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
)
Extract wind speed time series from xarray DataArray or Dataset.
Selects and extracts wind speed data for a specific geographic location and time period from an xarray DataArray or Dataset. Performs spatial interpolation using the specified neighbor lookup method and temporal selection based on the provided timestamps. Returns a structured WindSpeedSeries object with proper units and metadata.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
longitude | float | Longitude coordinate for data extraction (in degrees or radians). Will be converted to degrees internally if needed. | required |
latitude | float | Latitude coordinate for data extraction (in degrees or radians). Will be converted to degrees internally if needed. | required |
wind_speed_series | DataArray | Dataset | Input xarray DataArray or Dataset containing wind speed data with spatial (longitude, latitude) and temporal dimensions. | required |
timestamps | DatetimeIndex | Time index for temporal selection of the data, by default str(Timestamp.now()) | str(now()) |
neighbor_lookup | MethodForInexactMatches | None | Method for spatial interpolation when exact coordinate matches are not found, by default NEIGHBOR_LOOKUP_DEFAULT | NEIGHBOR_LOOKUP_DEFAULT |
tolerance | float | None | Maximum distance tolerance for spatial interpolation, by default TOLERANCE_DEFAULT | TOLERANCE_DEFAULT |
dtype | str | Data type for the output numpy array values, by default DATA_TYPE_DEFAULT | DATA_TYPE_DEFAULT |
log | int | Logging level for debug output, by default LOG_LEVEL_DEFAULT | LOG_LEVEL_DEFAULT |
Returns:
| Type | Description |
|---|---|
WindSpeedSeries | Structured wind speed time series object containing: - value: 1D numpy array with wind speed values - unit: Wind speed unit designation (typically m/s or km/h) - data_source: Original data source name from input |
Raises:
| Type | Description |
|---|---|
TypeError | If wind_speed_series is not a DataArray or Dataset. |
Notes
Wind speed data is typically measured at a standard height (often 10 or 2 meters) above ground level. The function automatically handles coordinate conversion to ensure compatibility with the underlying data. Scalar results are converted to 1D arrays for consistency in downstream processing.
Source code in pvgisprototype/api/series/wind_speed.py
def get_wind_speed_series_from_array_or_set(
longitude: float,
latitude: float,
wind_speed_series: DataArray | Dataset,
timestamps: DatetimeIndex = str(Timestamp.now()),
neighbor_lookup: MethodForInexactMatches | None = NEIGHBOR_LOOKUP_DEFAULT,
tolerance: float | None = TOLERANCE_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
):
"""Extract wind speed time series from xarray DataArray or Dataset.
Selects and extracts wind speed data for a specific geographic location
and time period from an xarray DataArray or Dataset. Performs spatial
interpolation using the specified neighbor lookup method and temporal selection
based on the provided timestamps. Returns a structured WindSpeedSeries
object with proper units and metadata.
Parameters
----------
longitude : float
Longitude coordinate for data extraction (in degrees or radians).
Will be converted to degrees internally if needed.
latitude : float
Latitude coordinate for data extraction (in degrees or radians).
Will be converted to degrees internally if needed.
wind_speed_series : DataArray | Dataset
Input xarray DataArray or Dataset containing wind speed data
with spatial (longitude, latitude) and temporal dimensions.
timestamps : DatetimeIndex, optional
Time index for temporal selection of the data,
by default str(Timestamp.now())
neighbor_lookup : MethodForInexactMatches | None, optional
Method for spatial interpolation when exact coordinate matches are not found,
by default NEIGHBOR_LOOKUP_DEFAULT
tolerance : float | None, optional
Maximum distance tolerance for spatial interpolation,
by default TOLERANCE_DEFAULT
dtype : str, optional
Data type for the output numpy array values,
by default DATA_TYPE_DEFAULT
log : int, optional
Logging level for debug output,
by default LOG_LEVEL_DEFAULT
Returns
-------
WindSpeedSeries
Structured wind speed time series object containing:
- value: 1D numpy array with wind speed values
- unit: Wind speed unit designation (typically m/s or km/h)
- data_source: Original data source name from input
Raises
------
TypeError
If wind_speed_series is not a DataArray or Dataset.
Notes
-----
Wind speed data is typically measured at a standard height (often 10 or 2 meters) above
ground level. The function automatically handles coordinate conversion to ensure
compatibility with the underlying data. Scalar results are converted to 1D arrays
for consistency in downstream processing.
"""
from pvgisprototype.api.series.select import select_time_series_from_array_or_set
if isinstance(wind_speed_series, DataArray | Dataset):
from pvgisprototype.api.utilities.conversions import (
convert_float_to_degrees_if_requested,
)
from pvgisprototype.constants import DEGREES
wind_speed_time_series = (
select_time_series_from_array_or_set(
data=wind_speed_series,
longitude=convert_float_to_degrees_if_requested(longitude, DEGREES),
latitude=convert_float_to_degrees_if_requested(latitude, DEGREES),
timestamps=timestamps,
# convert_longitude_360=convert_longitude_360,
neighbor_lookup=neighbor_lookup,
tolerance=tolerance,
verbose=0, # no verbosity here by choice!
log=log,
)
.to_numpy()
.astype(dtype=dtype)
)
if wind_speed_time_series.size == 1 and wind_speed_time_series.shape == ():
wind_speed_time_series = array([wind_speed_time_series], dtype=dtype)
else:
raise TypeError("Wind speed series must be a DataArray or Dataset.")
return WindSpeedSeries(
value=wind_speed_time_series,
data_source=wind_speed_series.name,
)
spectrum ¶
Modules:
| Name | Description |
|---|---|
helpers_pelland | |
spectral_effect | |
spectral_mismatch | |
helpers_pelland ¶
Functions:
| Name | Description |
|---|---|
adjust_band_limits | |
generate_banded_data | |
adjust_band_limits ¶
adjust_band_limits(
bands: DataFrame,
min_wavelength: float,
max_wavelength: float,
lower_band_wavelength_limit_name: str = "Lower limit [nm]",
center_band_wavelength_limit_name: str = "Center [nm]",
upper_band_wavelength_limit_name: str = "Upper limit [nm]",
dtype: str = DATA_TYPE_DEFAULT,
) -> DataFrame
Source code in pvgisprototype/api/spectrum/helpers_pelland.py
def adjust_band_limits(
bands: DataFrame,
min_wavelength: float,
max_wavelength: float,
lower_band_wavelength_limit_name: str = "Lower limit [nm]",
center_band_wavelength_limit_name: str = "Center [nm]",
upper_band_wavelength_limit_name: str = "Upper limit [nm]",
dtype: str = DATA_TYPE_DEFAULT,
) -> DataFrame:
"""Clip the reference bands to the [min_wavelength, max_wavelength] range."""
# bands = bands.astype(float)
bands = bands.astype(dtype)
bands = bands[
numpy.logical_and(
min_wavelength < bands[upper_band_wavelength_limit_name],
max_wavelength > bands[lower_band_wavelength_limit_name],
)
]
bands.reset_index(inplace=True, drop=True)
# Adjust the lower limit of the first band using .loc[]
# bands.iloc[0,:][lower_band_wavelength_limit_name] = max(min_wavelength,bands.iloc[0,:][lower_band_wavelength_limit_name])
bands.loc[bands.index[0], lower_band_wavelength_limit_name] = max(
min_wavelength, bands.loc[bands.index[0], lower_band_wavelength_limit_name]
)
# Adjust the upper limit of the last band using .loc[]
# bands.iloc[len(bands)-1,:][upper_band_wavelength_limit_name] = min(max_wavelength,bands.iloc[len(bands)-1,:][upper_band_wavelength_limit_name])
bands.loc[bands.index[-1], upper_band_wavelength_limit_name] = min(
max_wavelength, bands.loc[bands.index[-1], upper_band_wavelength_limit_name]
)
return bands
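`adjust_band_limits` keeps only the bands that overlap `[min_wavelength, max_wavelength]` and clamps the outermost edges to that interval. The same logic on plain `(lower, center, upper)` tuples (illustrative, DataFrame-free):

```python
def adjust_bands(bands, min_wavelength, max_wavelength):
    """bands: list of (lower, center, upper) tuples in nm."""
    # keep bands that overlap the requested wavelength interval
    kept = [
        band
        for band in bands
        if band[2] > min_wavelength and band[0] < max_wavelength
    ]
    if not kept:
        return []
    # clamp the first band's lower edge and the last band's upper edge
    kept[0] = (max(min_wavelength, kept[0][0]), kept[0][1], kept[0][2])
    kept[-1] = (kept[-1][0], kept[-1][1], min(max_wavelength, kept[-1][2]))
    return kept
```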
generate_banded_data ¶
generate_banded_data(
reference_bands,
spectral_data,
data_type,
lower_band_wavelength_limit_name: str = "Lower limit [nm]",
center_band_wavelength_name: str = "Center [nm]",
upper_band_wavelength_limit_name: str = "Upper limit [nm]",
dtype: str = DATA_TYPE_DEFAULT,
)
Source code in pvgisprototype/api/spectrum/helpers_pelland.py
def generate_banded_data(
reference_bands,
spectral_data,
data_type,
lower_band_wavelength_limit_name: str = 'Lower limit [nm]',
center_band_wavelength_name: str = 'Center [nm]',
upper_band_wavelength_limit_name: str = 'Upper limit [nm]',
dtype: str = DATA_TYPE_DEFAULT,
):
"""Interpolate spectral data to the reference band edges and integrate per band."""
# Make a copy of original data to keep it unmodified
data = spectral_data.copy()
# Add missing reference band edges in the data
for band_edge in reference_bands[lower_band_wavelength_limit_name].tolist() + [
reference_bands[upper_band_wavelength_limit_name].iloc[-1]
]:
if band_edge not in data.columns:
closest_smaller_edge = max(
[column for column in data.columns if column < band_edge]
)
# Insert new edge after the closest smaller one
data.insert(
data.columns.get_loc(closest_smaller_edge) + 1, band_edge, numpy.nan
)
# Now do dataframe interpolation to get the values at the band edges
data = data.apply(to_numeric)
data.interpolate(method="values", axis=1, inplace=True)
# Do numerical integration (trapezoidal) to get total for each band
banded_data = DataFrame(
data=numpy.nan,
index=data.index,
columns=reference_bands[center_band_wavelength_name],
dtype=dtype,
)
# Compute one column at a time
for col in numpy.arange(0, len(reference_bands)):
col_list = [
idx
for idx in range(len(data.columns))
if (
data.columns[idx] >= reference_bands[lower_band_wavelength_limit_name][col]
and data.columns[idx] <= reference_bands[upper_band_wavelength_limit_name][col]
)
]
if data_type == "responsivity":
banded_data[reference_bands[center_band_wavelength_name][col]] = integrate(
data.iloc[:, col_list]
) / (
reference_bands[upper_band_wavelength_limit_name][col]
- reference_bands[lower_band_wavelength_limit_name][col]
)
# Rename columns ! Ugly Hacks ---------------------------------------
if (
banded_data.columns.name == center_band_wavelength_name
or banded_data.columns.name == "Wavelength"
):
banded_data.columns.name = "center_wavelength"
elif data_type == "spectrum":
banded_data[reference_bands[center_band_wavelength_name][col]] = integrate(
data.iloc[:, col_list]
)
return banded_data
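For each band, `generate_banded_data` integrates the edge-interpolated data with the trapezoidal rule; for responsivity the integral is additionally divided by the band width to yield a band average. A NumPy sketch of the per-band computation (helper names are illustrative):

```python
import numpy

def band_total(wavelengths, values, lower, upper):
    """Trapezoidal integral of values over [lower, upper]; edges assumed present."""
    mask = (wavelengths >= lower) & (wavelengths <= upper)
    w, v = wavelengths[mask], values[mask]
    return float(numpy.sum((v[1:] + v[:-1]) / 2.0 * numpy.diff(w)))

def band_average(wavelengths, values, lower, upper):
    """Band-averaged value, as used for responsivity data above."""
    return band_total(wavelengths, values, lower, upper) / (upper - lower)
```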
spectral_effect ¶
Functions:
| Name | Description |
|---|---|
calculate_spectral_factor | Calculate the spectral factor for a time series. |
model_spectral_factor | |
calculate_spectral_factor ¶
calculate_spectral_factor(
longitude: Longitude,
latitude: Latitude,
elevation: Elevation,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
irradiance: DataFrame,
average_irradiance_density: DataFrame,
responsivity: Dict[str, Series],
photovoltaic_module_type: List[
PhotovoltaicModuleSpectralResponsivityModel
] = [cSi],
reference_spectrum: Series = None,
integrate_reference_spectrum: bool = False,
spectral_factor_models: List[SpectralMismatchModel] = [
pvlib
],
min_wavelength: float = MIN_WAVELENGTH,
max_wavelength: float = MAX_WAVELENGTH,
resample_large_series: bool = False,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> SpectralFactorSeries
Calculate the spectral factor for a time series.
Source code in pvgisprototype/api/spectrum/spectral_effect.py
def calculate_spectral_factor(
longitude: Longitude,
latitude: Latitude,
elevation: Elevation,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
irradiance: DataFrame,
average_irradiance_density: DataFrame,
# neighbor_lookup: MethodForInexactMatches = MethodForInexactMatches.nearest,
# tolerance: None | float = TOLERANCE_DEFAULT,
# mask_and_scale: bool = False,
# in_memory: bool = False,
# responsivity: Series,
responsivity: Dict[str, Series], # Dictionary to hold responsivity for each type
photovoltaic_module_type: List[PhotovoltaicModuleSpectralResponsivityModel] = [PhotovoltaicModuleSpectralResponsivityModel.cSi],
reference_spectrum: Series = None, # AM15G_IEC60904_3_ED4,
integrate_reference_spectrum: bool = False,
spectral_factor_models: List[SpectralMismatchModel] = [SpectralMismatchModel.pvlib],
min_wavelength: float = MIN_WAVELENGTH,
max_wavelength: float = MAX_WAVELENGTH,
# dtype: Annotated[str, typer_option_dtype] = DATA_TYPE_DEFAULT,
# array_backend: Annotated[str, typer_option_array_backend] = ARRAY_BACKEND_DEFAULT,
# multi_thread: Annotated[
# bool, typer_option_multi_thread
# ] = MULTI_THREAD_FLAG_DEFAULT,
resample_large_series: bool = False,
# dtype: str = DATA_TYPE_DEFAULT,
# array_backend: str = ARRAY_BACKEND_DEFAULT,
# multi_thread=multi_thread,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> SpectralFactorSeries:
"""Calculate the spectral factor for a time series."""
results = {}
# ctx.params.get() does not return a list !
photovoltaic_module_types = select_models(
PhotovoltaicModuleSpectralResponsivityModel, photovoltaic_module_type
)  # Using a callback fails!
for spectral_factor_model in spectral_factor_models:
if (
spectral_factor_model != SpectralMismatchModel.all
): # ignore 'all' in the enumeration
model_results = {} # store model output
for module_type in photovoltaic_module_types:
selected_responsivity = responsivity[module_type.value]
spectral_factor_series = model_spectral_factor(
# longitude=longitude,
# latitude=latitude,
timestamps=timestamps,
timezone=timezone,
spectral_factor_model=spectral_factor_model,
responsivity=selected_responsivity,
# responsivity=responsivity,
irradiance=irradiance,
average_irradiance_density=average_irradiance_density,
reference_spectrum=reference_spectrum,
# dtype=dtype,
# array_backend=array_backend,
verbose=verbose,
log=log,
)
components_container = {
"Metadata": lambda: {
RESPONSIVITY_COLUMN_NAME: responsivity[module_type.value],
IRRADIANCE_COLUMN_NAME: irradiance,
IRRADIANCE_SOURCE_COLUMN_NAME: 'UpdateMe',
REFERENCE_SPECTRUM_COLUMN_NAME: 'See constant AM15G_IEC60904_3_ED4',
}
if verbose > 2
else {},
"Spectral factor": lambda: {
TITLE_KEY_NAME: SPECTRAL_FACTOR_NAME,
SPECTRAL_FACTOR_COLUMN_NAME: spectral_factor_series.value,
TECHNOLOGY_NAME: module_type.name,
SPECTRAL_FACTOR_MODEL_COLUMN_NAME: (
spectral_factor_model
if spectral_factor_model
else NOT_AVAILABLE
),
UNIT_NAME: UNITLESS,
}, # if verbose > 0 else {},
"fingerprint": lambda: {
FINGERPRINT_COLUMN_NAME: generate_hash(spectral_factor_series),
}
if fingerprint
else {},
}
components = {}
for key, component in components_container.items():
components.update(component())
components = components | spectral_factor_series.components
model_results[module_type] = components
results[spectral_factor_model] = model_results
# results = results | spectral_factor_model_overview
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
# log_data_fingerprint(
# data=spectral_factor_series,
# log_level=log,
# hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
# )
return SpectralFactorSeries(
# value=spectral_factor_series,
components=results,
)
model_spectral_factor ¶
model_spectral_factor(
timestamps: DatetimeIndex,
timezone: ZoneInfo,
responsivity: Series,
irradiance: DataFrame | DataArray,
average_irradiance_density: (
None | DataFrame | DataArray
),
min_wavelength: float = MIN_WAVELENGTH,
max_wavelength: float = MAX_WAVELENGTH,
reference_spectrum: None | Series = None,
integrate_reference_spectrum: bool = False,
spectral_factor_model: SpectralMismatchModel = pvlib,
resample_large_series: bool = False,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
)
Source code in pvgisprototype/api/spectrum/spectral_effect.py
@log_function_call
def model_spectral_factor(
# longitude: Longitude,
# latitude: Latitude,
# elevation: Elevation,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
responsivity: Series,
irradiance: DataFrame | DataArray,
average_irradiance_density: None | DataFrame | DataArray,
# neighbor_lookup: MethodForInexactMatches = MethodForInexactMatches.nearest,
# tolerance: None | float = TOLERANCE_DEFAULT,
# mask_and_scale: bool = False,
# in_memory: bool = False,
min_wavelength: float = MIN_WAVELENGTH,
max_wavelength: float = MAX_WAVELENGTH,
reference_spectrum: None | Series = None, # AM15G_IEC60904_3_ED4,
integrate_reference_spectrum: bool = False,
spectral_factor_model: SpectralMismatchModel = SpectralMismatchModel.pvlib,
# dtype: Annotated[str, typer_option_dtype] = DATA_TYPE_DEFAULT,
# array_backend: Annotated[str, typer_option_array_backend] = ARRAY_BACKEND_DEFAULT,
# multi_thread: Annotated[
# bool, typer_option_multi_thread
# ] = MULTI_THREAD_FLAG_DEFAULT,
resample_large_series: bool = False,
# dtype: str = DATA_TYPE_DEFAULT,
# array_backend: str = ARRAY_BACKEND_DEFAULT,
# multi_thread=multi_thread,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
):
"""Compute the spectral factor using the selected model."""
spectral_factor = None
if spectral_factor_model == SpectralMismatchModel.pvlib:
pass
# average_irradiance_density_without_geographic_coordinates = (
# average_irradiance_density.drop_vars(
# ["longitude", "latitude"], errors="ignore"
# )
# )
# spectral_factor = calc_spectral_mismatch_field(
# sr=responsivity.to_dataframe().T, #T, # for one PV module technology
# e_sun=average_irradiance_density_without_geographic_coordinates.to_dataframe().T,
# e_ref=reference_spectrum,
# ).to_numpy() # Very Important !
if spectral_factor_model == SpectralMismatchModel.pelland:
spectral_factor = calculate_spectral_factor_pelland(
irradiance=irradiance,
responsivity=responsivity.T,
reference_spectrum=reference_spectrum,
)
return spectral_factor
spectral_mismatch ¶
Functions:
| Name | Description |
|---|---|
calculate_spectral_mismatch | Calculate the spectral mismatch for a time series. |
model_spectral_mismatch | |
calculate_spectral_mismatch ¶
calculate_spectral_mismatch(
longitude: Longitude,
latitude: Latitude,
elevation: Elevation,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
irradiance: DataFrame,
average_irradiance_density: DataFrame,
responsivity: Dict[str, Series],
photovoltaic_module_type: List[
PhotovoltaicModuleSpectralResponsivityModel
] = [cSi],
reference_spectrum: Series = None,
integrate_reference_spectrum: bool = False,
spectral_mismatch_models: List[
SpectralMismatchModel
] = [pvlib],
min_wavelength: float = MIN_WAVELENGTH,
max_wavelength: float = MAX_WAVELENGTH,
resample_large_series: bool = False,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> Dict
Calculate the spectral mismatch for a time series.
Source code in pvgisprototype/api/spectrum/spectral_mismatch.py
def calculate_spectral_mismatch(
longitude: Longitude,
latitude: Latitude,
elevation: Elevation,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
irradiance: DataFrame,
average_irradiance_density: DataFrame,
# neighbor_lookup: MethodForInexactMatches = MethodForInexactMatches.nearest,
# tolerance: None | float = TOLERANCE_DEFAULT,
# mask_and_scale: bool = False,
# in_memory: bool = False,
# responsivity: Series,
responsivity: Dict[str, Series], # Dictionary to hold responsivity for each type
photovoltaic_module_type: List[PhotovoltaicModuleSpectralResponsivityModel] = [PhotovoltaicModuleSpectralResponsivityModel.cSi],
reference_spectrum: Series = None, # AM15G_IEC60904_3_ED4,
integrate_reference_spectrum: bool = False,
spectral_mismatch_models: List[SpectralMismatchModel] = [SpectralMismatchModel.pvlib],
min_wavelength: float = MIN_WAVELENGTH,
max_wavelength: float = MAX_WAVELENGTH,
# dtype: Annotated[str, typer_option_dtype] = DATA_TYPE_DEFAULT,
# array_backend: Annotated[str, typer_option_array_backend] = ARRAY_BACKEND_DEFAULT,
# multi_thread: Annotated[
# bool, typer_option_multi_thread
# ] = MULTI_THREAD_FLAG_DEFAULT,
resample_large_series: bool = False,
# dtype: str = DATA_TYPE_DEFAULT,
# array_backend: str = ARRAY_BACKEND_DEFAULT,
# multi_thread=multi_thread,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
) -> Dict:
"""Calculate an overview of solar position parameters for a time series.
"""
results = {}
# ctx.params.get() does not return a list !
photovoltaic_module_types = select_models(
PhotovoltaicModuleSpectralResponsivityModel, photovoltaic_module_type
)  # Using a callback fails!
for spectral_mismatch_model in spectral_mismatch_models:
if (
spectral_mismatch_model != SpectralMismatchModel.all
): # ignore 'all' in the enumeration
model_results = {} # store model output
for module_type in photovoltaic_module_types:
selected_responsivity = responsivity[module_type.value]
spectral_mismatch_series = model_spectral_mismatch(
# longitude=longitude,
# latitude=latitude,
timestamps=timestamps,
timezone=timezone,
spectral_mismatch_model=spectral_mismatch_model,
responsivity=selected_responsivity,
# responsivity=responsivity,
irradiance=irradiance,
average_irradiance_density=average_irradiance_density,
reference_spectrum=reference_spectrum,
# dtype=dtype,
# array_backend=array_backend,
verbose=verbose,
log=log,
)
components_container = {
"Metadata": lambda: {
RESPONSIVITY_COLUMN_NAME: responsivity[module_type.value],
IRRADIANCE_COLUMN_NAME: irradiance,
IRRADIANCE_SOURCE_COLUMN_NAME: 'UpdateMe',
REFERENCE_SPECTRUM_COLUMN_NAME: 'See constant AM15G_IEC60904_3_ED4',
}
if verbose > 2
else {},
"Spectral factor": lambda: {
TITLE_KEY_NAME: SPECTRAL_FACTOR_NAME,
SPECTRAL_FACTOR_COLUMN_NAME: spectral_mismatch_series.value,
TECHNOLOGY_NAME: module_type.name,
SPECTRAL_MISMATCH_MODEL_COLUMN_NAME: (
spectral_mismatch_model
if spectral_mismatch_model
else NOT_AVAILABLE
),
UNIT_NAME: UNITLESS,
}, # if verbose > 0 else {},
"fingerprint": lambda: {
FINGERPRINT_COLUMN_NAME: generate_hash(spectral_mismatch_series),
}
if fingerprint
else {},
}
components = {}
for key, component in components_container.items():
components.update(component())
components = components | spectral_mismatch_series.components
model_results[module_type] = components
results[spectral_mismatch_model] = model_results
# results = results | spectral_mismatch_model_overview
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
# log_data_fingerprint(
# data=spectral_mismatch_series,
# log_level=log,
# hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
# )
return SpectralFactorSeries(
# value=spectral_mismatch_series,
components=results,
)
model_spectral_mismatch ¶
model_spectral_mismatch(
timestamps: DatetimeIndex,
timezone: ZoneInfo,
responsivity: Series,
irradiance: DataFrame | DataArray,
average_irradiance_density: (
None | DataFrame | DataArray
),
min_wavelength: float = MIN_WAVELENGTH,
max_wavelength: float = MAX_WAVELENGTH,
reference_spectrum: None | Series = None,
integrate_reference_spectrum: bool = False,
spectral_mismatch_model: SpectralMismatchModel = pvlib,
resample_large_series: bool = False,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
)
Source code in pvgisprototype/api/spectrum/spectral_mismatch.py
@log_function_call
def model_spectral_mismatch(
# longitude: Longitude,
# latitude: Latitude,
# elevation: Elevation,
timestamps: DatetimeIndex,
timezone: ZoneInfo,
responsivity: Series,
irradiance: DataFrame | DataArray,
average_irradiance_density: None | DataFrame | DataArray,
# neighbor_lookup: MethodForInexactMatches = MethodForInexactMatches.nearest,
# tolerance: None | float = TOLERANCE_DEFAULT,
# mask_and_scale: bool = False,
# in_memory: bool = False,
min_wavelength: float = MIN_WAVELENGTH,
max_wavelength: float = MAX_WAVELENGTH,
reference_spectrum: None | Series = None, # AM15G_IEC60904_3_ED4,
integrate_reference_spectrum: bool = False,
spectral_mismatch_model: SpectralMismatchModel = SpectralMismatchModel.pvlib,
# dtype: Annotated[str, typer_option_dtype] = DATA_TYPE_DEFAULT,
# array_backend: Annotated[str, typer_option_array_backend] = ARRAY_BACKEND_DEFAULT,
# multi_thread: Annotated[
# bool, typer_option_multi_thread
# ] = MULTI_THREAD_FLAG_DEFAULT,
resample_large_series: bool = False,
# dtype: str = DATA_TYPE_DEFAULT,
# array_backend: str = ARRAY_BACKEND_DEFAULT,
# multi_thread=multi_thread,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
):
"""
"""
spectral_mismatch = None
if spectral_mismatch_model.value == SpectralMismatchModel.pvlib:
pass
# average_irradiance_density_without_geographic_coordinates = (
# average_irradiance_density.drop_vars(
# ["longitude", "latitude"], errors="ignore"
# )
# )
# spectral_mismatch = calc_spectral_mismatch_field(
# sr=responsivity.to_dataframe().T, #T, # for one PV module technology
# e_sun=average_irradiance_density_without_geographic_coordinates.to_dataframe().T,
# e_ref=reference_spectrum,
# ).to_numpy() # Very Important !
if spectral_mismatch_model.value == SpectralMismatchModel.pelland:
spectral_mismatch = calculate_spectral_mismatch_pelland(
irradiance=irradiance,
responsivity=responsivity.T,
reference_spectrum=reference_spectrum,
)
return spectral_mismatch
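Both `model_spectral_factor` and `model_spectral_mismatch` compute variants of the standard spectral mismatch factor: the useful fraction of the measured spectrum divided by the useful fraction of the reference spectrum. A minimal NumPy sketch of that definition follows; the toy responsivity curve and the Gaussian stand-in spectra are illustrative assumptions, not the AM1.5G reference spectrum (`AM15G_IEC60904_3_ED4`) used by the library.

```python
import numpy as np

def spectral_mismatch_factor(responsivity, e_sun, e_ref):
    """Ratio of the useful fraction of the measured spectrum to the useful
    fraction of the reference spectrum (unitless)."""
    # On a uniform wavelength grid the integration step cancels in each
    # ratio, so plain sums stand in for the spectral integrals.
    useful_sun = np.sum(responsivity * e_sun) / np.sum(e_sun)
    useful_ref = np.sum(responsivity * e_ref) / np.sum(e_ref)
    return useful_sun / useful_ref

wavelength = np.linspace(300.0, 1200.0, 200)                # nm, uniform grid
responsivity = (wavelength - 300.0) / 900.0                 # toy responsivity rising with wavelength
reference = np.exp(-((wavelength - 550.0) / 250.0) ** 2)    # stand-in for a reference spectrum
red_shifted = np.exp(-((wavelength - 700.0) / 250.0) ** 2)  # spectrum shifted to longer wavelengths

m_same = spectral_mismatch_factor(responsivity, reference, reference)
m_shifted = spectral_mismatch_factor(responsivity, red_shifted, reference)
```

By construction a spectrum identical to the reference yields a factor of exactly 1, while a spectrum shifted towards wavelengths of higher responsivity yields a factor above 1.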
statistics ¶
Modules:
| Name | Description |
|---|---|
pandas | |
polars | |
xarray | |
pandas ¶
Functions:
| Name | Description |
|---|---|
calculate_mean_of_series_per_time_unit | |
calculate_statistics | Calculate the sum, mean, standard deviation of a series based on a specified frequency and its percentage relative to a reference series. |
calculate_sum_and_percentage | Calculate sum of a series and its percentage relative to a reference series. |
calculate_mean_of_series_per_time_unit ¶
calculate_mean_of_series_per_time_unit(
series: ndarray,
timestamps: DatetimeIndex,
frequency: str,
)
Source code in pvgisprototype/api/statistics/pandas.py
def calculate_mean_of_series_per_time_unit(
series: numpy.ndarray,
timestamps: DatetimeIndex,
frequency: str,
):
""" """
# from devtools import debug
# debug(locals())
if frequency == "Single" or len(timestamps) == 1:
return series.mean().item() # Direct mean for a single value
pandas_series = pandas.Series(series, index=timestamps)
mean = pandas_series.resample(frequency).sum().mean().item() # convert to float
return mean
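The resample-then-average behaviour can be sketched with plain pandas; the input values and timestamps here are illustrative.

```python
import numpy as np
import pandas as pd

series = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9])
timestamps = pd.date_range(start="2022-01-01", end="2022-01-02", freq="3h")

# Sum within each day, then average the daily sums:
# day one sums to 36, day two to 9, so the mean of the sums is 22.5.
daily_sums = pd.Series(series, index=timestamps).resample("D").sum()
mean_per_day = daily_sums.mean().item()
```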
calculate_statistics ¶
calculate_statistics(
series,
timestamps,
frequency,
reference_series,
rounding_places=None,
dtype=DATA_TYPE_DEFAULT,
array_backend=ARRAY_BACKEND_DEFAULT,
)
Calculate the sum, mean, standard deviation of a series based on a specified frequency and its percentage relative to a reference series.
Source code in pvgisprototype/api/statistics/pandas.py
def calculate_statistics(
series,
timestamps,
frequency,
reference_series,
rounding_places=None,
dtype=DATA_TYPE_DEFAULT,
array_backend=ARRAY_BACKEND_DEFAULT,
):
"""Calculate the sum, mean, standard deviation of a series based on a
specified frequency and its percentage relative to a reference series.
"""
if frequency == "Single":
total = series.item() # total is the single value in the series
mean = total # For a single value, the mean is the value itself
std_dev = 0 # Standard deviation is 0 for a single value
percentage = (total / reference_series * 100) if reference_series != 0 else 0
if rounding_places is not None:
total = round_float_values(total, rounding_places)
percentage = round_float_values(percentage, rounding_places)
return total, mean, std_dev, percentage
pandas_series = pandas.Series(series, timestamps)
resampled = pandas_series.resample(frequency)
total = resampled.sum().sum().item() # convert to Python float
# if isinstance(total, numpy.ndarray):
# total = total.astype(dtype)
percentage = (total / reference_series * 100) if reference_series != 0 else 0
# if isinstance(percentage, numpy.ndarray):
# percentage.astype(dtype)
if rounding_places is not None:
total = round_float_values(total, rounding_places)
percentage = round_float_values(percentage, rounding_places)
mean = resampled.mean().mean().item() # convert to Python float
std_dev = resampled.std().mean() # Mean of standard deviations over the period
return total, mean, std_dev, percentage
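The individual steps of the resampled branch can be reproduced with plain pandas; the data below are illustrative.

```python
import numpy as np
import pandas as pd

series = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9])
timestamps = pd.date_range(start="2022-01-01", end="2022-01-02", freq="3h")
reference = 4.0

resampled = pd.Series(series, index=timestamps).resample("D")
total = resampled.sum().sum().item()    # 36 + 9 = 45
mean = resampled.mean().mean().item()   # (4.5 + 9.0) / 2 = 6.75
std_dev = resampled.std().mean()        # mean of the per-day standard deviations
percentage = total / reference * 100    # 1125.0
```

Note that with the default `ddof=1`, the standard deviation of a single-sample day is NaN and is skipped by the final mean.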
calculate_sum_and_percentage ¶
calculate_sum_and_percentage(
series,
reference_series,
rounding_places=None,
dtype=DATA_TYPE_DEFAULT,
array_backend=ARRAY_BACKEND_DEFAULT,
)
Calculate sum of a series and its percentage relative to a reference series.
Notes
Uses .item() to convert NumPy numerics to standard Python types.
Source code in pvgisprototype/api/statistics/pandas.py
def calculate_sum_and_percentage(
series,
reference_series,
rounding_places=None,
dtype=DATA_TYPE_DEFAULT,
array_backend=ARRAY_BACKEND_DEFAULT,
):
"""Calculate sum of a series and its percentage relative to a reference series.
Notes
-----
Uses .item() to convert NumPy numerics to standard Python types.
"""
    total = numpy.nansum(series).item()  # .item() yields a Python scalar, so no ndarray cast is needed
percentage = (total / reference_series * 100) if reference_series != 0 else 0
if isinstance(percentage, numpy.ndarray):
percentage.astype(dtype)
if rounding_places is not None:
total = round_float_values(total, rounding_places)
percentage = round_float_values(percentage, rounding_places)
return total, percentage
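At its core this is a NaN-aware sum followed by a zero-guarded percentage; a minimal sketch with illustrative values:

```python
import numpy as np

series = np.array([10.0, np.nan, 30.0])
reference = 200.0

total = np.nansum(series).item()  # NaNs are ignored -> 40.0
percentage = (total / reference * 100) if reference != 0 else 0  # 20.0
```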
polars ¶
Functions:
| Name | Description |
|---|---|
calculate_mean_of_series_per_time_unit | Calculate the mean of a series resampled to a specified time frequency using Polars. |
calculate_statistics | Calculate the descriptive statistics for a series based on a specified frequency and its percentage relative to a reference series. |
calculate_sum_and_percentage | Calculate sum of a series and its percentage relative to a reference series. |
get_season | Map each timestamp to a season |
calculate_mean_of_series_per_time_unit ¶
calculate_mean_of_series_per_time_unit(
series: ndarray,
timestamps: DatetimeIndex,
frequency: str,
) -> ScalarType
Calculate the mean of a series resampled to a specified time frequency using Polars.
Source code in pvgisprototype/api/statistics/polars.py
def calculate_mean_of_series_per_time_unit(
series: numpy.ndarray,
timestamps: DatetimeIndex,
frequency: str,
) -> numpy.ScalarType:
"""Calculate the mean of a series resampled to a specified time frequency using Polars."""
logger.debug(
f"The series input {series} is of type {type(series)}.",
alt=f"The series input {series} is of type {type(series)}.",
)
if numpy.isscalar(
series
):  # NOTE in case of a single (scalar) value, the mean is the value itself
return series
# Handle the case for a single timestamp or "Single" frequency
if frequency == "Single" or len(timestamps) == 1:
logger.debug(
f"The requested frequency is {frequency} or the input DatetimeIndex is a single timestamp.",
alt=f"The requested frequency is [code]{frequency}[/code] or the DatetimeIndex is a single timestamp.",
)
return series.mean().item() # Direct mean for a single value
# Create a Polars DataFrame with series values and timestamps
data = polars.DataFrame(
{"timestamps": polars.Series(timestamps), "values": polars.Series(series)}
)
# Convert Pandas to Polars frequency strings
polars_frequency = FREQUENCY_PANDAS_TO_POLARS.get(frequency, frequency)
# Resample data using Polars' dynamic grouping
resampled_sum = (
data.sort("timestamps")
.group_by_dynamic("timestamps", every=polars_frequency)
.agg(polars.col("values").fill_nan(None).sum())
) # Sum within each time unit
# Compute the mean of the summed values
return resampled_sum["values"].mean()
calculate_statistics ¶
calculate_statistics(
series,
timestamps,
frequency,
reference_series,
rounding_places=None,
dtype=DATA_TYPE_DEFAULT,
array_backend=ARRAY_BACKEND_DEFAULT,
)
Calculate the descriptive statistics (sum, mean and standard deviation) of a series based on a specified frequency, and its percentage relative to a reference series.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
series | ndarray | The input series. | required |
timestamps | ndarray | The timestamps associated with the series. | required |
frequency | str | The frequency of the series (e.g., "S" for seasonal). | required |
reference_series | ndarray | The reference series. | required |
rounding_places | Optional[int] | The number of decimal places to round the results to. Defaults to None. | None |
dtype | str | The data type of the results. Defaults to np.float64. | DATA_TYPE_DEFAULT |
array_backend | str | The array backend to use. Defaults to "numpy". | ARRAY_BACKEND_DEFAULT |
Returns:
| Type | Description |
|---|---|
tuple | A tuple containing the sum, mean, standard deviation, and percentage of the series relative to the reference series. |
See Also
numpy.sum, numpy.mean, numpy.std
Notes
This function uses Polars DataFrames for efficient grouping and aggregation.
Examples:
Calculate statistics for a seasonal series:
>>> from numpy import array
>>> from pandas import date_range
>>> series = array([1, 2, 3, 4, 5, 6, 7, 8, 9])
>>> timestamps = date_range(start='2022-01-01', end='2022-01-02', freq='3h')
>>> frequency = "D"
>>> reference_series = array([4])
>>> calculate_statistics(series, timestamps, frequency, reference_series)
(45.0, 6.75, 2.4494898319244385, 1125.0)
Raises:
| Type | Description |
|---|---|
ValueError | If series or reference_series is None. |
Source code in pvgisprototype/api/statistics/polars.py
@log_function_call
def calculate_statistics(
series,
timestamps,
frequency,
reference_series,
rounding_places=None,
dtype=DATA_TYPE_DEFAULT,
array_backend=ARRAY_BACKEND_DEFAULT,
):
"""Calculate the descriptive statistics for a series based on a specified
frequency and its percentage relative to a reference series.
Calculate the sum, mean and standard deviation of a series based on a
frequency and its percentage relative to a reference series.
Parameters
----------
series : np.ndarray
The input series.
timestamps : np.ndarray
The timestamps associated with the series.
frequency : str
The frequency of the series (e.g., "S" for seasonal).
reference_series : np.ndarray
The reference series.
rounding_places : Optional[int], optional
The number of decimal places to round the results to. Defaults to None.
dtype : str, optional
The data type of the results. Defaults to np.float64.
array_backend : str, optional
The array backend to use. Defaults to "numpy".
Returns
-------
tuple
A tuple containing the sum, mean, standard deviation, and percentage of
the series relative to the reference series.
See Also
--------
numpy.sum, numpy.mean, numpy.std
Notes
-----
This function uses Polars DataFrames for efficient grouping and aggregation.
Examples
--------
Calculate statistics for a seasonal series:
>>> from numpy import array
>>> from pandas import date_range
>>> series = array([1, 2, 3, 4, 5, 6, 7, 8, 9])
>>> timestamps = date_range(start='2022-01-01', end='2022-01-02', freq='3h')
>>> frequency = "D"
>>> reference_series = array([4])
>>> calculate_statistics(series, timestamps, frequency, reference_series)
(45.0, 6.75, 2.4494898319244385, 1125.0)
Raises
------
ValueError
If series or reference_series is None.
"""
logger.debug("Calculate statistics")
# Ensure initial inputs are in the specified dtype
logger.debug(
f"The input series {series} of shape {series.shape} is of type {type(series)} while the requested type is {dtype}.",
alt=f"The input series {series} of shape {series.shape} is of type {type(series)} while the requested type is {dtype}.",
)
series = numpy.asarray(series, dtype=dtype) if series.dtype != dtype else series
reference_series = (
numpy.asarray(reference_series, dtype=dtype)
if not isinstance(reference_series, numpy.generic)
or reference_series.dtype != dtype
else reference_series
)
if frequency == "Single":
logger.debug(
f"The requested frequency is {frequency}.",
alt=f"The requested frequency is [code]{frequency}[/code].",
)
# total = series.sum()
total = numpy.nansum(series, dtype=dtype)
mean = total
std_dev = numpy.array(
0, dtype=dtype
) # zero standard deviation for single values
percentage = (total / reference_series * 100) if reference_series != 0 else 0
if rounding_places is not None:
total = round(total, rounding_places)
percentage = round(percentage, rounding_places)
return total, mean, std_dev, percentage
# Convert timestamps and series to Polars DataFrame
data = polars.DataFrame(
{
"timestamps": polars.Series(timestamps),
"values": polars.Series(series, dtype=getattr(polars, dtype.capitalize())),
}
)
# Seasonal grouping
if frequency == "S":
logger.debug(
f"The requested frequency is {frequency} meaning seasonal.",
alt=f"The requested frequency is {frequency} meaning [italic]seasonal[/italic].",
)
# Add a season column based on month
data = data.with_columns(
polars.col("timestamps").dt.month().apply(get_season).alias("season")
)
# Group by season
resampled = data.group_by("season").agg(
[
polars.col("values").fill_nan(None).sum().alias("total"),
polars.col("values").fill_nan(None).mean().alias("mean"),
polars.col("values").fill_nan(None).std().alias("std_dev"),
]
)
# Non-seasonal grouping
else:
# Convert Pandas to Polars frequency strings
polars_frequency = FREQUENCY_PANDAS_TO_POLARS.get(frequency, frequency)
logger.debug(
f"The requested frequency is {frequency} (Polars : {polars_frequency}).",
alt=f"The requested frequency is [code]{frequency}[/code] (Polars : {polars_frequency}).",
)
resampled = (
data.sort("timestamps")
.group_by_dynamic("timestamps", every=polars_frequency)
.agg(
[
polars.col("values").fill_nan(None).sum().alias("total"),
polars.col("values").fill_nan(None).mean().alias("mean"),
polars.col("values").fill_nan(None).std().alias("std_dev"),
]
)
)
# Calculate sum, mean, std_dev over all resampled intervals _and_ cast to dtype
total = numpy.array(resampled["total"].sum(), dtype=dtype).item()
mean = numpy.array(resampled["mean"].mean(), dtype=dtype).item()
std_dev = numpy.array(resampled["std_dev"].mean(), dtype=dtype).item()
percentage = (
numpy.array((total / reference_series * 100), dtype=dtype).item()
if reference_series != 0
else numpy.array(0, dtype=dtype).item()
)
# Apply rounding if needed
if rounding_places is not None:
logger.debug(
f"Rounding values total : {total}, mean : {mean}, std_dev : {std_dev} and percentage : {percentage}",
alt=f"Rounding values total : {total}, mean : {mean}, std_dev : {std_dev} and percentage : {percentage}",
)
total = round_float_values(total, rounding_places)
mean = round_float_values(mean, rounding_places)
std_dev = round_float_values(std_dev, rounding_places)
percentage = round_float_values(percentage, rounding_places)
# Return values as scalars if they are single elements, otherwise as arrays
total = total if numpy.isscalar(total) else numpy.array(total, dtype=dtype)
mean = mean if numpy.isscalar(mean) else numpy.array(mean, dtype=dtype)
std_dev = std_dev if numpy.isscalar(std_dev) else numpy.array(std_dev, dtype=dtype)
percentage = (
percentage
if numpy.isscalar(percentage)
else numpy.array(percentage, dtype=dtype)
)
return total, mean, std_dev, percentage
calculate_sum_and_percentage ¶
calculate_sum_and_percentage(
series,
reference_series,
rounding_places=None,
dtype=DATA_TYPE_DEFAULT,
array_backend=ARRAY_BACKEND_DEFAULT,
)
Calculate sum of a series and its percentage relative to a reference series.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
series | array-like | The input series to sum over. | required |
reference_series | float or int | Reference value for calculating percentage. | required |
rounding_places | int | Number of decimal places for rounding the result. | None |
dtype | str | Data type for the calculations, default is "float32". | DATA_TYPE_DEFAULT |
Returns:
| Type | Description |
|---|---|
tuple | The total sum and percentage relative to the reference. |
Source code in pvgisprototype/api/statistics/polars.py
@log_function_call
def calculate_sum_and_percentage(
series,
reference_series,
rounding_places=None,
dtype=DATA_TYPE_DEFAULT,
array_backend=ARRAY_BACKEND_DEFAULT,
):
"""Calculate sum of a series and its percentage relative to a reference series.
Parameters
----------
series : array-like
The input series to sum over.
reference_series : float or int
Reference value for calculating percentage.
rounding_places : int, optional
Number of decimal places for rounding the result.
dtype : str, optional
Data type for the calculations, default is "float32".
Returns
-------
tuple
The total sum and percentage relative to the reference.
"""
total = numpy.nansum(series, dtype=dtype)
percentage = (total / reference_series * 100) if reference_series != 0 else 0
if rounding_places is not None:
total = round_float_values(total, rounding_places)
percentage = round_float_values(percentage, rounding_places)
return total, percentage
get_season ¶
Map a month number to a meteorological season (DJF, MAM, JJA, SON)
Source code in pvgisprototype/api/statistics/polars.py
@log_function_call
def get_season(month):
"""
Map each timestamp to a season
"""
if month in [12, 1, 2]: # December-January-February
return "DJF"
elif month in [3, 4, 5]: # March-April-May
return "MAM"
elif month in [6, 7, 8]: # June-July-August
return "JJA"
elif month in [9, 10, 11]: # September-October-November
return "SON"
xarray ¶
Functions:
| Name | Description |
|---|---|
calculate_series_statistics | |
calculate_spectral_factor_statistics | Calculate statistics for the spectral factor data. |
generate_series_statistics | |
group_series_statistics | |
calculate_series_statistics ¶
calculate_series_statistics(
data_array: ndarray | Dict[str, ndarray],
timestamps: DatetimeIndex,
groupby: str | None = None,
) -> dict
Source code in pvgisprototype/api/statistics/xarray.py
def calculate_series_statistics(
data_array: numpy.ndarray | Dict[str, numpy.ndarray],
timestamps: DatetimeIndex,
groupby: str | None = None,
) -> dict:
""" """
irradiance_xarray = None # Ugly Hack :-/
if isinstance(data_array, dict):
# First, irradiance may exist only in a dictionary !
irradiance_xarray = data_array.get(GLOBAL_INCLINED_IRRADIANCE_COLUMN_NAME, None)
if irradiance_xarray is not None:
irradiance_xarray = DataArray(
irradiance_xarray,
coords=[("time", timestamps)],
name="Effective irradiance series",
)
irradiance_xarray.attrs["units"] = "W/m^2"
irradiance_xarray.attrs["long_name"] = "Effective Solar Irradiance"
irradiance_xarray.load()
# Then, the primary wanted data
data_array = data_array[PHOTOVOLTAIC_POWER_COLUMN_NAME]
# Regardless of whether the input data_array is an array or a dict :
data_xarray = DataArray(
data_array, coords=[("time", timestamps)], name="Effective irradiance series"
)
data_xarray.attrs["units"] = "W/m^2"
data_xarray.attrs["long_name"] = "Photovoltaic power"
data_xarray.load()
statistics = generate_series_statistics(data_xarray=data_xarray, groupby=groupby)
statistics = group_series_statistics(
data_xarray=data_xarray,
irradiance_xarray=irradiance_xarray,
statistics=statistics,
groupby=groupby,
)
return statistics
calculate_spectral_factor_statistics ¶
calculate_spectral_factor_statistics(
spectral_factor: Dict,
spectral_factor_model: List,
photovoltaic_module_type: List,
timestamps: DatetimeIndex,
rounding_places: int | None = 3,
groupby: str | None = None,
) -> dict
Calculate statistics for the spectral factor data.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
spectral_factor | Dict | Dictionary containing spectral factor data. | required |
spectral_factor_model | List | List of spectral factor models. | required |
photovoltaic_module_type | List | List of photovoltaic module types. | required |
timestamps | DatetimeIndex | Timestamps for the data series. | required |
rounding_places | int | Decimal places for rounding. | 3 |
groupby | str | Time grouping for statistics, e.g., 'Y', 'M', 'D', etc. | None |
Returns:
| Name | Type | Description |
|---|---|---|
statistics | dict | Dictionary with calculated statistics for each model and module type. |
Source code in pvgisprototype/api/statistics/xarray.py
def calculate_spectral_factor_statistics(
spectral_factor: Dict,
spectral_factor_model: List,
photovoltaic_module_type: List,
timestamps: DatetimeIndex,
rounding_places: int | None = 3,
groupby: str | None = None,
) -> dict:
"""
Calculate statistics for the spectral factor data.
Parameters
----------
spectral_factor : Dict
Dictionary containing spectral factor data.
spectral_factor_model : List
List of spectral factor models.
photovoltaic_module_type : List
List of photovoltaic module types.
timestamps : DatetimeIndex
Timestamps for the data series.
rounding_places : int
Decimal places for rounding.
groupby : str
Time grouping for statistics, e.g., 'Y', 'M', 'D', etc.
Returns
-------
statistics : dict
Dictionary with calculated statistics for each model and module type.
"""
statistics = {}
for model in spectral_factor_model:
statistics[model.value] = {}
for module_type in photovoltaic_module_type:
# Extract spectral factor data for the model and module type
spectral_factor_data = (
spectral_factor.get(model)
.get(module_type)
.get(SPECTRAL_FACTOR_COLUMN_NAME)
)
if spectral_factor_data is not None:
# Create an Xarray DataArray for the spectral factor data
spectral_factor_xarray = DataArray(
spectral_factor_data,
coords=[("time", timestamps)],
name=f"{module_type.value} Spectral Mismatch",
)
spectral_factor_xarray.attrs["units"] = "W/m^2"
spectral_factor_xarray.attrs["long_name"] = (
f"{module_type.value} Spectral Factor"
)
# Generate basic and extended statistics
module_statistics = generate_series_statistics(
data_xarray=spectral_factor_xarray,
groupby=groupby,
)
# Add time-grouped statistics (e.g., yearly, monthly) if requested
module_statistics = group_series_statistics(
data_xarray=spectral_factor_xarray,
irradiance_xarray=None,
statistics=module_statistics,
groupby=groupby,
)
# Store the statistics for this combination of model and module type
statistics[model.value][module_type.value] = module_statistics
return statistics
generate_series_statistics ¶
Source code in pvgisprototype/api/statistics/xarray.py
def generate_series_statistics(
data_xarray: DataArray,
groupby: str | None = None,
) -> dict:
""" """
statistics_container = {
"Basic": lambda: {
"Start": data_xarray.time.values[0],
"End": data_xarray.time.values[-1],
"Count": data_xarray.count().item(),
"Min": data_xarray.min().item(),
"Mean": data_xarray.mean().item(),
"Max": data_xarray.max().item(),
"Sum": data_xarray.sum().item(),
},
"Extended": lambda: {
"25th Percentile": numpy.percentile(data_xarray, 25),
"Median": data_xarray.median().item(),
"Mode": mode(data_xarray.values.flatten())[0],
"Variance": data_xarray.var().item(),
"Standard deviation": data_xarray.std().item(),
}, # if verbose > 1 else {},
"Timestamps": lambda: {
"Time of Min": data_xarray.idxmin("time").values,
"Index of Min": data_xarray.argmin().item(),
"Time of Max": data_xarray.idxmax("time").values,
"Index of Max": data_xarray.argmax().item(),
}, # if verbose > 2 else {},
"Coordinates": lambda: (
{
"Longitude of Max": data_xarray.argmax("lon").item(),
"Latitude of Max": data_xarray.argmax("lat").item(),
}
if "longitude" in data_xarray.dims and "latitude" in data_xarray.dims
else {}
),
}
statistics = {}
for _, func in statistics_container.items():
statistics.update(func())
return statistics
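The "Basic" and "Timestamps" groups can be sketched with pandas in place of xarray; the library builds them from a `DataArray`, and the values below are illustrative.

```python
import numpy as np
import pandas as pd

timestamps = pd.date_range("2022-01-01", periods=5, freq="D")
values = pd.Series([3.0, 1.0, 4.0, 1.0, 5.0], index=timestamps)

statistics = {
    "Basic": {
        "Start": values.index[0],
        "End": values.index[-1],
        "Count": int(values.count()),
        "Min": values.min(),
        "Mean": values.mean(),
        "Max": values.max(),
        "Sum": values.sum(),
    },
    "Timestamps": {
        # argmin/argmax return the first occurrence for ties
        "Time of Min": values.idxmin(),
        "Index of Min": int(np.argmin(values.to_numpy())),
        "Time of Max": values.idxmax(),
        "Index of Max": int(np.argmax(values.to_numpy())),
    },
}
```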
group_series_statistics ¶
group_series_statistics(
data_xarray: DataArray | None,
irradiance_xarray: DataArray | None,
statistics: dict,
groupby: str | None = None,
)
Source code in pvgisprototype/api/statistics/xarray.py
def group_series_statistics(
data_xarray: DataArray | None,
irradiance_xarray: DataArray | None,
statistics: dict,
groupby: str | None = None,
):
""" """
if groupby in TIME_GROUPINGS:
freq, label = TIME_GROUPINGS[groupby]
if groupby in ["Y", "M", "S"]:
statistics[label] = data_xarray.groupby(f"time.{freq}").mean().values
if irradiance_xarray is not None:
statistics[GLOBAL_INCLINED_IRRADIANCE_COLUMN_NAME] = (
irradiance_xarray.groupby(f"time.{freq}").mean().values
)
else:
statistics[label] = data_xarray.resample(time=freq).mean().values
statistics["Sum of Group Means"] = statistics[label].sum()
if irradiance_xarray is not None:
statistics[f"Sum of {GLOBAL_INCLINED_IRRADIANCE_COLUMN_NAME}"] = statistics[
GLOBAL_INCLINED_IRRADIANCE_COLUMN_NAME
].sum()
elif groupby: # custom frequencies like '3H', '2W', etc.
custom_label = f"{groupby} means"
statistics[custom_label] = data_xarray.resample(time=groupby).mean().values
statistics["Sum of Group Means"] = (
data_xarray.resample(time=groupby).mean().sum().values
)
return statistics
surface ¶
Modules:
| Name | Description |
|---|---|
optimizer_bounds | |
output | |
parameters | |
positioning | |
power | |
presentation_example | Example for a single day |
recommender | |
optimizer_bounds ¶
Functions:
| Name | Description |
|---|---|
define_optimiser_bounds | Define bounds for the optimisation process. |
define_optimiser_bounds ¶
define_optimiser_bounds(
min_surface_orientation: float,
max_surface_orientation: float,
min_surface_tilt: float,
max_surface_tilt: float,
mode: SurfacePositionOptimizerMode,
method: SurfacePositionOptimizerMethod,
verbose: int = VERBOSE_LEVEL_DEFAULT,
) -> tuple | Bounds
Define bounds for the optimisation process.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
min_surface_orientation | float | The minimum surface orientation allowed. | required |
max_surface_orientation | float | The maximum surface orientation allowed. | required |
min_surface_tilt | float | The minimum surface tilt allowed. | required |
max_surface_tilt | float | The maximum surface tilt allowed. | required |
mode | SurfacePositionOptimizerMode | The optimisation mode. | required |
method | SurfacePositionOptimizerMethod | The optimisation method. | required |
verbose | int | The verbosity level. Defaults to VERBOSE_LEVEL_DEFAULT. | VERBOSE_LEVEL_DEFAULT |
Returns:
| Type | Description |
|---|---|
tuple | Bounds | The bounds for the optimisation process. |
Notes
The bounds are defined as follows:
- For the SurfacePositionOptimizerMode.Orientation_and_Tilt mode, the bounds are defined as a tuple of two slices.
- For the SurfacePositionOptimizerMode.Tilt mode, the bounds are defined as a Bounds object with the lower and upper bounds set to the minimum and maximum surface tilt respectively.
- For the SurfacePositionOptimizerMode.Orientation mode, the bounds are defined as a Bounds object with the lower and upper bounds set to the minimum and maximum surface orientation respectively.
If the method is SurfacePositionOptimizerMethod.brute, the bounds are returned as a tuple of two slices. Otherwise, the bounds are returned as a Bounds object.
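The behaviour described in the Notes can be sketched with plain-Python stand-ins. The `Mode` enum and `Bounds` dataclass below are simplified placeholders for the project's `SurfacePositionOptimizerMode` and `scipy.optimize.Bounds`, not the real classes:

```python
from dataclasses import dataclass
from enum import Enum, auto
from math import radians


class Mode(Enum):  # stand-in for SurfacePositionOptimizerMode
    Orientation = auto()
    Tilt = auto()
    Orientation_and_Tilt = auto()


@dataclass
class Bounds:  # simplified stand-in for scipy.optimize.Bounds
    lb: object
    ub: object


def define_bounds(min_az, max_az, min_tilt, max_tilt, mode, brute=False):
    step = radians(1)  # 1-degree grid for the brute-force search
    az_range = slice(min_az, max_az, step)
    tilt_range = slice(min_tilt, max_tilt, step)
    if brute:
        # brute returns slice grids: a pair for both angles, a 1-tuple otherwise
        if mode is Mode.Orientation_and_Tilt:
            return (az_range, tilt_range)
        return ((tilt_range if mode is Mode.Tilt else az_range),)
    if mode is Mode.Tilt:
        return Bounds(lb=min_tilt, ub=max_tilt)
    if mode is Mode.Orientation:
        return Bounds(lb=min_az, ub=max_az)
    return Bounds(lb=[min_az, min_tilt], ub=[max_az, max_tilt])
```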
Source code in pvgisprototype/api/surface/optimizer_bounds.py
def define_optimiser_bounds(
min_surface_orientation: float,
max_surface_orientation: float,
min_surface_tilt: float,
max_surface_tilt: float,
mode: SurfacePositionOptimizerMode,
method: SurfacePositionOptimizerMethod,
verbose: int = VERBOSE_LEVEL_DEFAULT,
) -> tuple | Bounds:
"""
Define bounds for the optimisation process.
Parameters
----------
min_surface_orientation: float
The minimum surface orientation allowed.
max_surface_orientation: float
The maximum surface orientation allowed.
min_surface_tilt: float
The minimum surface tilt allowed.
max_surface_tilt: float
The maximum surface tilt allowed.
mode: SurfacePositionOptimizerMode
The optimisation mode.
method: SurfacePositionOptimizerMethod
The optimisation method.
verbose: int, optional
The verbosity level. Defaults to VERBOSE_LEVEL_DEFAULT.
Returns
-------
tuple | Bounds
The bounds for the optimisation process.
Notes
-----
The bounds are defined as follows:
- For the SurfacePositionOptimizerMode.Orientation_and_Tilt mode, the bounds are defined as a tuple of two slices.
- For the SurfacePositionOptimizerMode.Tilt mode, the bounds are defined as a Bounds object with the lower and upper
bounds set to the minimum and maximum surface tilt respectively.
- For the SurfacePositionOptimizerMode.Orientation mode, the bounds are defined as a Bounds object with the lower and
upper bounds set to the minimum and maximum surface orientation respectively.
If the method is SurfacePositionOptimizerMethod.brute, the bounds are returned as a tuple of two slices. Otherwise,
the bounds are returned as a Bounds object.
"""
brute_force_precision = radians(1)
surface_orientation_range = slice(
min_surface_orientation, max_surface_orientation, brute_force_precision
)
surface_tilt_range = slice(
min_surface_tilt, max_surface_tilt, brute_force_precision
)
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug(
f"i Define bounds for the '{method}' optimiser ..",
alt=f"i [bold]Define[/bold] bounds for the [magenta]{method}[/magenta] optimiser ..",
)
if method == SurfacePositionOptimizerMethod.brute:
return (
(surface_orientation_range, surface_tilt_range)
if mode == SurfacePositionOptimizerMode.Orientation_and_Tilt
else (
(
surface_tilt_range
if mode == SurfacePositionOptimizerMode.Tilt
else surface_orientation_range
),
)
)
if mode == SurfacePositionOptimizerMode.Tilt:
return Bounds(lb=surface_tilt_range.start, ub=surface_tilt_range.stop)
if mode == SurfacePositionOptimizerMode.Orientation:
return Bounds(
lb=surface_orientation_range.start, ub=surface_orientation_range.stop
)
if mode == SurfacePositionOptimizerMode.Orientation_and_Tilt:
return Bounds(
lb=[surface_orientation_range.start, surface_tilt_range.start],
ub=[surface_orientation_range.stop, surface_tilt_range.stop],
)
raise ValueError("Invalid mode provided.")
output ¶
Functions:
| Name | Description |
|---|---|
build_optimiser_output | Build the output dictionary for the surface position optimisation. |
build_optimiser_output ¶
build_optimiser_output(
optimiser_output: OptimizeResult | ndarray,
objective_function_arguments: dict,
mode: SurfacePositionOptimizerMode,
method: SurfacePositionOptimizerMethod,
surface_orientation: (
SurfaceOrientation | None
) = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt | None = SURFACE_TILT_DEFAULT,
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
angle_output_units: str = RADIANS,
verbose: int = VERBOSE_LEVEL_DEFAULT,
)
Build the output dictionary for the surface position optimisation.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
optimiser_output | OptimizeResult | ndarray | The output of the optimiser. | required |
objective_function_arguments | dict | The arguments passed to the optimiser. | required |
mode | SurfacePositionOptimizerMode | The mode of the optimisation. | required |
method | SurfacePositionOptimizerMethod | The method used for the optimisation. | required |
surface_orientation | SurfaceOrientation | None | The surface orientation. If None, the default value is used. | SURFACE_ORIENTATION_DEFAULT |
surface_tilt | SurfaceTilt | None | The surface tilt. If None, the default value is used. | SURFACE_TILT_DEFAULT |
solar_time_model | SolarTimeModel | The solar time model used for the optimisation. | SOLAR_TIME_ALGORITHM_DEFAULT |
angle_output_units | str | The units of the angle output. Default is radians. | RADIANS |
verbose | int | The verbosity level. Default is 0. | VERBOSE_LEVEL_DEFAULT |
Returns:
| Name | Type | Description |
|---|---|---|
optimal_surface_position | dict | A dictionary containing the optimal surface position, the mean photovoltaic power output and the units of the angle output. |
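The optimiser minimises the *negative* mean power, so the builder flips the sign of the objective value (`-optimiser_output.fun`) when reporting. A minimal sketch of that convention, using a `SimpleNamespace` stand-in for the optimiser result and an invented toy objective:

```python
from types import SimpleNamespace


def negative_mean_power(tilt):
    # Toy objective: power peaks at tilt = 0.6 rad (hypothetical numbers)
    return -(100.0 - 50.0 * (tilt - 0.6) ** 2)


# Stand-in for the OptimizeResult returned by a minimiser
result = SimpleNamespace(x=[0.6], fun=negative_mean_power(0.6), success=True)

output = {
    "surface_tilt": result.x[0],
    "mean_photovoltaic_power": -result.fun,  # undo the sign flip
}
```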
Source code in pvgisprototype/api/surface/output.py
def build_optimiser_output(
optimiser_output: OptimizeResult | ndarray,
objective_function_arguments: dict,
mode: SurfacePositionOptimizerMode,
method: SurfacePositionOptimizerMethod,
surface_orientation: SurfaceOrientation | None = SURFACE_ORIENTATION_DEFAULT,
surface_tilt: SurfaceTilt | None = SURFACE_TILT_DEFAULT,
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
angle_output_units: str = RADIANS,
verbose: int = VERBOSE_LEVEL_DEFAULT,
):
"""
Build the output dictionary for the surface position optimisation.
Parameters
----------
optimiser_output : OptimizeResult | ndarray
The output of the optimiser.
objective_function_arguments : dict
The arguments passed to the optimiser.
mode : SurfacePositionOptimizerMode
The mode of the optimisation.
method : SurfacePositionOptimizerMethod
The method used for the optimisation.
surface_orientation : SurfaceOrientation | None
The surface orientation. If None, the default value is used.
surface_tilt : SurfaceTilt | None
The surface tilt. If None, the default value is used.
solar_time_model : SolarTimeModel
The solar time model used for the optimisation.
angle_output_units : str
The units of the angle output. Default is radians.
verbose : int
The verbosity level. Default is 0.
Returns
-------
optimal_surface_position : dict
A dictionary containing the optimal surface position, the mean photovoltaic
power output and the units of the angle output.
"""
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug(
f"i Build the output dictionary",
alt=f"i [bold]Build[/bold] the [magenta]output dictionary[/magenta]",
)
optimal_surface_position = {
SURFACE_ORIENTATION_NAME: None,
SURFACE_TILT_NAME: None,
MEAN_PHOTOVOLTAIC_POWER_NAME: None,
UNITS_COLUMN_NAME: angle_output_units,
TIME_ALGORITHM_NAME: solar_time_model.value,
}
_optimal_surface_position = OptimalSurfacePosition(
angle_output_units=angle_output_units,
solar_timing_algorithm=solar_time_model.value,
)
if mode == SurfacePositionOptimizerMode.Tilt:
if not isinstance(
surface_orientation, SurfaceOrientation
):
surface_orientation = SurfaceOrientation(
value=surface_orientation,
unit=RADIANS,
)
surface_orientation = SurfaceOrientation(
value=convert_float_to_degrees_if_requested(
surface_orientation.value, angle_output_units
),
unit=angle_output_units,
)
optimal_surface_position[SURFACE_ORIENTATION_NAME] = surface_orientation
_optimal_surface_position.surface_orientation = surface_orientation
if method == SurfacePositionOptimizerMethod.brute:
optimal_surface_tilt = SurfaceTilt(
value=convert_float_to_degrees_if_requested(
optimiser_output, angle_output_units # type: ignore[arg-type]
),
unit=angle_output_units,
optimal=True,
optimizer=method,
)
optimal_surface_position[SURFACE_TILT_NAME] = optimal_surface_tilt
_optimal_surface_position.surface_tilt = optimal_surface_tilt
# NOTE Make last call to get full results as from API
# NOTE No expected execution time cost since results are cached
# NOTE Implement something like the snippet below
# --------------------------------------------------->
# arguments["surface_tilt"] = optimal_position[SURFACE_TILT_NAME].radians
# arguments["surface_orientation"]=optimal_position[SURFACE_ORIENTATION_NAME].radians
# photovoltaic_power_series = calculate_photovoltaic_power_output_series(
# **arguments,
# )
# ----------------------------------------------------
mean_photovoltaic_power = (
-calculate_mean_negative_photovoltaic_power_output(
surface_angle=optimiser_output,
objective_function_arguments=objective_function_arguments,
mode=mode,
)
)
optimal_surface_position[MEAN_PHOTOVOLTAIC_POWER_NAME] = mean_photovoltaic_power
_optimal_surface_position.mean_photovoltaic_power = mean_photovoltaic_power
# from devtools import debug
# debug(optimal_surface_position)
# debug(_optimal_surface_position)
return optimal_surface_position, _optimal_surface_position
if optimiser_output.success: # type: ignore[union-attr]
optimal_surface_tilt = SurfaceTilt(
value=convert_float_to_degrees_if_requested(
optimiser_output.x[0], angle_output_units # type: ignore[union-attr]
),
unit=angle_output_units,
optimal=True,
optimizer=method,
)
optimal_surface_position[SURFACE_TILT_NAME] = optimal_surface_tilt
_optimal_surface_position.surface_tilt = optimal_surface_tilt
# NOTE Make last call to get full results as from API
# NOTE No expected execution time cost since results are cached
# NOTE Implement something like the snippet below
# --------------------------------------------------->
objective_function_arguments["surface_orientation"] = _optimal_surface_position.surface_orientation.radians
objective_function_arguments["surface_tilt"] = _optimal_surface_position.surface_tilt.radians
photovoltaic_power_series = calculate_photovoltaic_power_output_series(
**objective_function_arguments,
)
# ----------------------------------------------------
mean_photovoltaic_power = (
-optimiser_output.fun # type: ignore[union-attr]
)
optimal_surface_position[MEAN_PHOTOVOLTAIC_POWER_NAME] = mean_photovoltaic_power
_optimal_surface_position.photovoltaic_power = photovoltaic_power_series
_optimal_surface_position.mean_photovoltaic_power = mean_photovoltaic_power
# from devtools import debug
# debug(optimal_surface_position)
# debug(_optimal_surface_position)
return optimal_surface_position, _optimal_surface_position
if mode == SurfacePositionOptimizerMode.Orientation:
if not isinstance(
surface_tilt, SurfaceTilt
):
surface_tilt = SurfaceTilt(
value=surface_tilt,
unit=RADIANS,
)
surface_tilt = SurfaceTilt(
value=convert_float_to_degrees_if_requested(
surface_tilt.value, angle_output_units
),
unit=angle_output_units,
)
optimal_surface_position[SURFACE_TILT_NAME] = surface_tilt
_optimal_surface_position.surface_tilt = surface_tilt
if method == SurfacePositionOptimizerMethod.brute:
surface_orientation = SurfaceOrientation(
value=convert_float_to_degrees_if_requested(
optimiser_output, angle_output_units # type: ignore[arg-type]
),
unit=angle_output_units,
optimal=True,
optimizer=method,
)
optimal_surface_position[SURFACE_ORIENTATION_NAME] = surface_orientation
_optimal_surface_position.surface_orientation = surface_orientation
# NOTE Make last call to get full results as from API
# NOTE No expected execution time cost since results are cached
# NOTE Implement something like the snippet below
# --------------------------------------------------->
# arguments["surface_tilt"] = optimal_position[SURFACE_TILT_NAME].radians
# arguments["surface_orientation"]=optimal_position[SURFACE_ORIENTATION_NAME].radians
# photovoltaic_power_series = calculate_photovoltaic_power_output_series(
# **arguments,
# )
# ----------------------------------------------------
mean_photovoltaic_power = (
-calculate_mean_negative_photovoltaic_power_output(
surface_angle=optimiser_output,
objective_function_arguments=objective_function_arguments,
mode=mode,
)
)
optimal_surface_position[MEAN_PHOTOVOLTAIC_POWER_NAME] = mean_photovoltaic_power
_optimal_surface_position.mean_photovoltaic_power = mean_photovoltaic_power
elif optimiser_output.success: # type: ignore[union-attr]
surface_orientation = SurfaceOrientation(
value=convert_float_to_degrees_if_requested(
optimiser_output.x[0], angle_output_units # type: ignore[union-attr]
),
unit=angle_output_units,
optimal=True,
optimizer=method,
)
optimal_surface_position[SURFACE_ORIENTATION_NAME] = surface_orientation
_optimal_surface_position.surface_orientation = surface_orientation
# NOTE Make last call to get full results as from API
# NOTE No expected execution time cost since results are cached
# NOTE Implement something like the snippet below
# --------------------------------------------------->
# arguments["surface_tilt"] = optimal_position[SURFACE_TILT_NAME].radians
# arguments["surface_orientation"]=optimal_position[SURFACE_ORIENTATION_NAME].radians
# photovoltaic_power_series = calculate_photovoltaic_power_output_series(
# **arguments,
# )
# ----------------------------------------------------
mean_photovoltaic_power = (
-optimiser_output.fun # type: ignore[union-attr]
)
optimal_surface_position[MEAN_PHOTOVOLTAIC_POWER_NAME] = mean_photovoltaic_power
_optimal_surface_position.mean_photovoltaic_power = mean_photovoltaic_power
if mode == SurfacePositionOptimizerMode.Orientation_and_Tilt:
if method == SurfacePositionOptimizerMethod.brute:
optimal_surface_orientation = SurfaceOrientation(
value=convert_float_to_degrees_if_requested(
optimiser_output[0], angle_output_units
),
unit=angle_output_units,
optimal=True,
optimizer=method,
)
optimal_surface_position[SURFACE_ORIENTATION_NAME] = optimal_surface_orientation
optimal_surface_tilt = SurfaceTilt(
value=convert_float_to_degrees_if_requested(
optimiser_output[1], angle_output_units
),
unit=angle_output_units,
optimal=True,
optimizer=method,
)
optimal_surface_position[SURFACE_TILT_NAME] = optimal_surface_tilt
# NOTE Make last call to get full results as from API
# NOTE No expected execution time cost since results are cached
# NOTE Implement something like the snippet below
# --------------------------------------------------->
# arguments["surface_tilt"] = optimal_position[SURFACE_TILT_NAME].radians
# arguments["surface_orientation"]=optimal_position[SURFACE_ORIENTATION_NAME].radians
# photovoltaic_power_series = calculate_photovoltaic_power_output_series(
# **arguments,
# )
# ----------------------------------------------------
mean_photovoltaic_power = (
-calculate_mean_negative_photovoltaic_power_output(
surface_angle=optimiser_output,
objective_function_arguments=objective_function_arguments,
mode=mode,
)
)
optimal_surface_position[MEAN_PHOTOVOLTAIC_POWER_NAME] = mean_photovoltaic_power
_optimal_surface_position.mean_photovoltaic_power = mean_photovoltaic_power
elif optimiser_output.success: # type: ignore[union-attr]
optimal_surface_orientation = SurfaceOrientation(
value=convert_float_to_degrees_if_requested(
optimiser_output.x[0], angle_output_units # type: ignore[union-attr]
),
unit=angle_output_units,
optimal=True,
optimizer=method,
)
optimal_surface_position[SURFACE_ORIENTATION_NAME] = optimal_surface_orientation
optimal_surface_tilt = SurfaceTilt(
value=convert_float_to_degrees_if_requested(
optimiser_output.x[1], angle_output_units # type: ignore[union-attr]
),
unit=angle_output_units,
optimal=True,
optimizer=method,
)
optimal_surface_position[SURFACE_TILT_NAME] = optimal_surface_tilt
# NOTE Make last call to get full results as from API
# NOTE No expected execution time cost since results are cached
# NOTE Implement something like the snippet below
# --------------------------------------------------->
# arguments["surface_tilt"] = optimal_position[SURFACE_TILT_NAME].radians
# arguments["surface_orientation"]=optimal_position[SURFACE_ORIENTATION_NAME].radians
# photovoltaic_power_series = calculate_photovoltaic_power_output_series(
# **arguments,
# )
# ----------------------------------------------------
optimal_surface_position[MEAN_PHOTOVOLTAIC_POWER_NAME] = -optimiser_output.fun # type: ignore[union-attr]
_optimal_surface_position.mean_photovoltaic_power = -optimiser_output.fun
# from devtools import debug
# debug(optimal_surface_position)
# debug(_optimal_surface_position)
return optimal_surface_position, _optimal_surface_position
parameters ¶
Functions:
| Name | Description |
|---|---|
build_location_dictionary | Build a dictionary containing location parameters. |
build_other_input_arguments_dictionary | Build a dictionary of input arguments for the photovoltaic model. |
build_location_dictionary ¶
build_location_dictionary(
longitude: float,
latitude: float,
elevation: float,
verbose: int = VERBOSE_LEVEL_DEFAULT,
)
Build a dictionary containing location parameters.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
longitude | float | The longitude of the location. | required |
latitude | float | The latitude of the location. | required |
elevation | float | The elevation of the location. | required |
verbose | int | The verbosity level. Defaults to VERBOSE_LEVEL_DEFAULT. | VERBOSE_LEVEL_DEFAULT |
Returns:
| Name | Type | Description |
|---|---|---|
location_arguments | dict | A dictionary containing the location arguments. |
Source code in pvgisprototype/api/surface/parameters.py
def build_location_dictionary(
longitude: float,
latitude: float,
elevation: float,
# timestamps: DatetimeIndex,
# timezone: ZoneInfo,
# surface_orientation: float,
# surface_tilt: float,
# mode: SurfacePositionOptimizerMode,
verbose: int = VERBOSE_LEVEL_DEFAULT,
):
"""
Build a dictionary containing location parameters.
Parameters
----------
longitude : float
The longitude of the location.
latitude : float
The latitude of the location.
elevation : float
The elevation of the location.
verbose : int, optional
The verbosity level. Defaults to VERBOSE_LEVEL_DEFAULT.
Returns
-------
location_arguments : dict
A dictionary containing the location arguments.
"""
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug(
f"i Collect location arguments",
alt=f"i [bold]Collect[/bold] the [magenta]location arguments[/magenta]",
)
location_arguments = {
"longitude": longitude,
"latitude": latitude,
"elevation": elevation,
}
return location_arguments
build_other_input_arguments_dictionary ¶
Build a dictionary of input arguments for the photovoltaic model.
This function collects all keyword arguments into a single dictionary and optionally logs a message if the verbosity level is high enough.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
verbose | int | Verbosity level used for logging. If greater than the configured threshold (HASH_AFTER_THIS_VERBOSITY_LEVEL), a logging message is printed. | VERBOSE_LEVEL_DEFAULT |
**kwargs | dict | Arbitrary keyword arguments representing input parameters: irradiance, spectral factor, temperature and wind speed series; the photovoltaic module, shading, solar position, incidence and time models; and power, temperature and efficiency settings. | {} |
Returns:
| Type | Description |
|---|---|
dict | A dictionary containing all keyword arguments and the `verbose` flag. |
Notes
The function uses **kwargs to dynamically accept and return a flexible set of arguments.
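The `dict(kwargs, verbose=verbose)` pattern used in the source merges every keyword argument and the verbosity flag in one expression; a small self-contained sketch of the same idiom (the argument names here are illustrative):

```python
def build_arguments(verbose: int = 0, **kwargs) -> dict:
    # dict(mapping, **extras) copies kwargs and appends the verbose flag
    return dict(kwargs, verbose=verbose)
```

Any keywords passed through are returned unchanged alongside `verbose`, so downstream callers can unpack the result with `**`.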
Source code in pvgisprototype/api/surface/parameters.py
def build_other_input_arguments_dictionary(
verbose: int = VERBOSE_LEVEL_DEFAULT,
**kwargs,
) -> dict:
"""
Build a dictionary of input arguments for the photovoltaic model.
This function collects all keyword arguments into a single dictionary
and optionally logs a message if the verbosity level is high enough.
Parameters
----------
verbose : int, optional
Verbosity level used for logging. If greater than the configured
threshold (`HASH_AFTER_THIS_VERBOSITY_LEVEL`), a logging message is printed.
Default is 0.
**kwargs : dict
Arbitrary keyword arguments representing input parameters. These include,
but are not limited to:
- global_horizontal_irradiance : array
- direct_horizontal_irradiance : array
- spectral_factor_series : array | SpectralFactorSeries
- photovoltaic_module : PhotovoltaicModuleModel
- temperature_series : array | TemperatureSeries
- wind_speed_series : array | WindspeedSeries
- horizon_profile : DataArray | None
- shading_model : ShadingModel
- linke_turbidity_factor_series : array | LinkeTurbidityFactor
- shading_states : ShadingState
- adjust_for_atmospheric_refraction : bool
- refracted_solar_zenith : float | None
- albedo : float | None
- apply_reflectivity_factor : bool
- solar_position_model : SolarPositionModel
- sun_horizon_position : SunHorizonPositionModel
- solar_incidence_model : SolarIncidenceModel
- zero_negative_solar_incidence_angle : bool
- solar_time_model : SolarTimeModel
- solar_constant : float
- perigee_offset : float
- eccentricity_correction_factor : float
- peak_power : float | None
- system_efficiency : float | None
- power_model : PhotovoltaicModulePerformanceModel
- temperature_model : ModuleTemperatureAlgorithm
- efficiency : float | None
Returns
-------
dict
A dictionary containing all keyword arguments and the `verbose` flag.
Notes
-----
The function uses `**kwargs` to dynamically accept and return a flexible set
of arguments.
"""
if verbose > HASH_AFTER_THIS_VERBOSITY_LEVEL:
logger.debug(
"i Collect the remaining input arguments",
alt="i [bold]Collect[/bold] the [magenta]remaining input arguments[/magenta]",
)
return dict(kwargs, verbose=verbose)
positioning ¶
Functions:
| Name | Description |
|---|---|
build_surface_position_optimisation_mode | |
optimise_surface_position | Optimise the position of a surface. |
build_surface_position_optimisation_mode ¶
build_surface_position_optimisation_mode(
surface_orientation,
surface_tilt,
mode: SurfacePositionOptimizerMode,
)
Source code in pvgisprototype/api/surface/positioning.py
def build_surface_position_optimisation_mode(
surface_orientation,
surface_tilt,
mode: SurfacePositionOptimizerMode
):
"""
"""
surface_position_arguments = {}
if mode == SurfacePositionOptimizerMode.Tilt:
surface_position_arguments = {
'surface_orientation': surface_orientation
}
if mode == SurfacePositionOptimizerMode.Orientation:
surface_position_arguments["surface_tilt"] = surface_tilt
return surface_position_arguments
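The mapping above fixes whichever angle is *not* being optimised: in `Tilt` mode the orientation is held constant, in `Orientation` mode the tilt is, and in `Orientation_and_Tilt` mode both angles are free. A plain-Python sketch of the same dispatch (the `Mode` enum is a simplified stand-in for `SurfacePositionOptimizerMode`):

```python
from enum import Enum, auto


class Mode(Enum):  # stand-in for SurfacePositionOptimizerMode
    Orientation = auto()
    Tilt = auto()
    Orientation_and_Tilt = auto()


def fixed_angle_arguments(orientation, tilt, mode):
    # Return the angle that stays constant during optimisation
    if mode is Mode.Tilt:
        return {"surface_orientation": orientation}
    if mode is Mode.Orientation:
        return {"surface_tilt": tilt}
    return {}  # both angles are free in Orientation_and_Tilt mode
```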
optimise_surface_position ¶
optimise_surface_position(
longitude: Longitude,
latitude: Latitude,
elevation: float,
surface_orientation: SurfaceOrientation = SurfaceOrientation(
value=radians(180), unit="radians"
),
surface_tilt: SurfaceTilt = SurfaceTilt(
value=radians(45), unit="radians"
),
min_surface_orientation: float = min_radians,
max_surface_orientation: float = max_radians,
min_surface_tilt: float = min_radians,
max_surface_tilt: float = max_radians,
timestamps: DatetimeIndex = DatetimeIndex(
[now(tz="UTC")]
),
timezone: ZoneInfo = ZoneInfo("UTC"),
global_horizontal_irradiance: ndarray | None = None,
direct_horizontal_irradiance: ndarray | None = None,
spectral_factor_series: SpectralFactorSeries = SpectralFactorSeries(
value=SPECTRAL_FACTOR_DEFAULT
),
temperature_series: TemperatureSeries = TemperatureSeries(
value=TEMPERATURE_DEFAULT
),
wind_speed_series: WindSpeedSeries = WindSpeedSeries(
value=WIND_SPEED_DEFAULT
),
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = pvgis,
shading_states: List[ShadingState] = [all],
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
refracted_solar_zenith: (
float | None
) = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT,
albedo: float | None = ALBEDO_DEFAULT,
apply_reflectivity_factor: bool = ANGULAR_LOSS_FACTOR_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
sun_horizon_position: List[
SunHorizonPositionModel
] = SUN_HORIZON_POSITION_DEFAULT,
solar_incidence_model: SolarIncidenceModel = iqbal,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
photovoltaic_module: PhotovoltaicModuleModel = CSI_FREE_STANDING,
peak_power: float = PEAK_POWER_DEFAULT,
system_efficiency: (
float | None
) = SYSTEM_EFFICIENCY_DEFAULT,
power_model: PhotovoltaicModulePerformanceModel = king,
temperature_model: ModuleTemperatureAlgorithm = faiman,
efficiency: float | None = EFFICIENCY_FACTOR_DEFAULT,
mode: SurfacePositionOptimizerMode = Tilt,
method: SurfacePositionOptimizerMethod = l_bfgs_b,
number_of_sampling_points: int = NUMBER_OF_SAMPLING_POINTS_SURFACE_POSITION_OPTIMIZATION,
iterations: int = NUMBER_OF_ITERATIONS_DEFAULT,
precision_goal: float = 0.0001,
shgo_sampling_method=sobol,
workers: int = WORKERS_FOR_SURFACE_POSITION_OPTIMIZATION,
angle_output_units: str = ANGLE_OUTPUT_UNITS_DEFAULT,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
profile: bool = cPROFILE_FLAG_DEFAULT,
)
Optimise the position of a surface.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
longitude | float | The longitude of the location. | required |
latitude | float | The latitude of the location. | required |
elevation | float | The elevation of the location. | required |
surface_orientation | SurfaceOrientation | The orientation of the surface. | SurfaceOrientation(value=radians(180), unit='radians') |
surface_tilt | SurfaceTilt | The tilt of the surface. | SurfaceTilt(value=radians(45), unit='radians') |
min_surface_orientation | float | The minimum orientation of the surface allowed. | min_radians |
max_surface_orientation | float | The maximum orientation of the surface allowed. | max_radians |
min_surface_tilt | float | The minimum tilt of the surface allowed. | min_radians |
max_surface_tilt | float | The maximum tilt of the surface allowed. | max_radians |
timestamps | DatetimeIndex | None | The timestamps to use for the optimisation. | DatetimeIndex([now(tz='UTC')]) |
timezone | ZoneInfo | The timezone to use for the optimisation. | ZoneInfo('UTC') |
global_horizontal_irradiance | ndarray | None | The global horizontal irradiance. | None |
direct_horizontal_irradiance | ndarray | None | The direct horizontal irradiance. | None |
spectral_factor_series | SpectralFactorSeries | The spectral factor series. | SpectralFactorSeries(value=SPECTRAL_FACTOR_DEFAULT) |
temperature_series | TemperatureSeries | The temperature series. | TemperatureSeries(value=TEMPERATURE_DEFAULT) |
wind_speed_series | WindSpeedSeries | The wind speed series. | WindSpeedSeries(value=WIND_SPEED_DEFAULT) |
linke_turbidity_factor_series | LinkeTurbidityFactor | The Linke turbidity factor series. | LinkeTurbidityFactor() |
horizon_profile | DataArray | None | The horizon profile. | None |
shading_model | ShadingModel | The shading model. | pvgis |
shading_states | List[ShadingState] | The shading states. | [all] |
photovoltaic_module | PhotovoltaicModuleModel | The photovoltaic module. | CSI_FREE_STANDING |
adjust_for_atmospheric_refraction | bool | Whether to apply atmospheric refraction. | ATMOSPHERIC_REFRACTION_FLAG_DEFAULT |
refracted_solar_zenith | float | None | The refracted solar zenith angle. | UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT |
albedo | float | None | The albedo. | ALBEDO_DEFAULT |
apply_reflectivity_factor | bool | Whether to apply the reflectivity factor. | ANGULAR_LOSS_FACTOR_FLAG_DEFAULT |
solar_position_model | SolarPositionModel | The solar position model. | SOLAR_POSITION_ALGORITHM_DEFAULT |
sun_horizon_position | List[SunHorizonPositionModel] | The sun horizon position. | SUN_HORIZON_POSITION_DEFAULT |
solar_incidence_model | SolarIncidenceModel | The solar incidence model. | iqbal |
zero_negative_solar_incidence_angle | bool | Whether to zero negative solar incidence angles. | ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT |
solar_time_model | SolarTimeModel | The solar time model. | SOLAR_TIME_ALGORITHM_DEFAULT |
solar_constant | float | The solar constant. | SOLAR_CONSTANT |
eccentricity_phase_offset | float | The eccentricity phase offset. | ECCENTRICITY_PHASE_OFFSET |
eccentricity_amplitude | float | The eccentricity amplitude (correction factor). | ECCENTRICITY_CORRECTION_FACTOR |
peak_power | float | The peak power of the photovoltaic module. | PEAK_POWER_DEFAULT |
system_efficiency | float | None | The system efficiency. | SYSTEM_EFFICIENCY_DEFAULT |
power_model | PhotovoltaicModulePerformanceModel | The power model. | king |
temperature_model | ModuleTemperatureAlgorithm | The temperature model. | faiman |
efficiency | float | None | The efficiency factor. | EFFICIENCY_FACTOR_DEFAULT |
mode | SurfacePositionOptimizerMode | The optimization mode. Available options are Orientation, Tilt and Orientation_and_Tilt. | Tilt |
method | SurfacePositionOptimizerMethod | The optimization method. Multiple options are supported including L-BFGS-B, SHGO, CG. | l_bfgs_b |
number_of_sampling_points | int | The number of sampling points. | NUMBER_OF_SAMPLING_POINTS_SURFACE_POSITION_OPTIMIZATION |
iterations | int | The number of iterations. | NUMBER_OF_ITERATIONS_DEFAULT |
precision_goal | float | The precision goal. | 0.0001 |
shgo_sampling_method | SurfacePositionOptimizerMethodSHGOSamplingMethod | The sampling method for the SHGO optimizer. | sobol |
workers | int | The number of workers. | WORKERS_FOR_SURFACE_POSITION_OPTIMIZATION |
angle_output_units | str | The unit of the angle output. | ANGLE_OUTPUT_UNITS_DEFAULT |
dtype | str | The data type to use for arrays. | DATA_TYPE_DEFAULT |
array_backend | str | The array backend to use. | ARRAY_BACKEND_DEFAULT |
verbose | int | The verbosity level. | VERBOSE_LEVEL_DEFAULT |
log | int | The log level. | LOG_LEVEL_DEFAULT |
fingerprint | bool | Whether to fingerprint the data. | FINGERPRINT_FLAG_DEFAULT |
profile | bool | Whether to profile the function. | cPROFILE_FLAG_DEFAULT |
Returns:
| Name | Type | Description |
|---|---|---|
optimiser_output | OptimizeResult | ndarray | The optimal surface position. |
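For `SurfacePositionOptimizerMethod.brute`, the search reduces to evaluating the objective on a 1-degree grid and keeping the best point. A self-contained toy version of that grid search (the quadratic power curve is invented for illustration; the real objective is the mean photovoltaic power series):

```python
from math import radians


def toy_mean_power(tilt):
    # Hypothetical power curve peaking at a 35-degree tilt
    return 100.0 - 50.0 * (tilt - radians(35)) ** 2


def brute_force_tilt(min_tilt, max_tilt, step=radians(1)):
    # Walk the tilt grid and keep the highest-power sample
    best_tilt, best_power = min_tilt, toy_mean_power(min_tilt)
    tilt = min_tilt
    while tilt <= max_tilt:
        power = toy_mean_power(tilt)
        if power > best_power:
            best_tilt, best_power = tilt, power
        tilt += step
    return best_tilt, best_power
```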
Source code in pvgisprototype/api/surface/positioning.py
@log_function_call
def optimise_surface_position(
longitude: Longitude,
latitude: Latitude,
elevation: float, # change it to Elevation
#
surface_orientation: SurfaceOrientation = SurfaceOrientation(
value=radians(180), unit="radians"
), # SurfaceOrientation().default_radians
surface_tilt: SurfaceTilt = SurfaceTilt(
value=radians(45), unit="radians"
), # SurfaceTilt().default_radians
min_surface_orientation: float = SurfaceOrientation().min_radians,
max_surface_orientation: float = SurfaceOrientation().max_radians,
min_surface_tilt: float = SurfaceTilt().min_radians,
max_surface_tilt: float = SurfaceTilt().max_radians,
#
timestamps: DatetimeIndex = DatetimeIndex([Timestamp.now(tz="UTC")]),
timezone: ZoneInfo = ZoneInfo("UTC"),
#
global_horizontal_irradiance: ndarray | None = None,
direct_horizontal_irradiance: ndarray | None = None,
spectral_factor_series: SpectralFactorSeries = SpectralFactorSeries(
value=SPECTRAL_FACTOR_DEFAULT
),
temperature_series: TemperatureSeries = TemperatureSeries(
value=TEMPERATURE_DEFAULT
),
wind_speed_series: WindSpeedSeries = WindSpeedSeries(value=WIND_SPEED_DEFAULT),
linke_turbidity_factor_series: LinkeTurbidityFactor = LinkeTurbidityFactor(),
#
horizon_profile: DataArray | None = None,
shading_model: ShadingModel = ShadingModel.pvgis,
shading_states: List[ShadingState] = [ShadingState.all],
#
adjust_for_atmospheric_refraction: bool = ATMOSPHERIC_REFRACTION_FLAG_DEFAULT,
refracted_solar_zenith: float | None = UNREFRACTED_SOLAR_ZENITH_ANGLE_DEFAULT,
albedo: float | None = ALBEDO_DEFAULT,
apply_reflectivity_factor: bool = ANGULAR_LOSS_FACTOR_FLAG_DEFAULT,
solar_position_model: SolarPositionModel = SOLAR_POSITION_ALGORITHM_DEFAULT,
sun_horizon_position: List[SunHorizonPositionModel] = SUN_HORIZON_POSITION_DEFAULT,
solar_incidence_model: SolarIncidenceModel = SolarIncidenceModel.iqbal,
zero_negative_solar_incidence_angle: bool = ZERO_NEGATIVE_INCIDENCE_ANGLE_DEFAULT,
solar_time_model: SolarTimeModel = SOLAR_TIME_ALGORITHM_DEFAULT,
solar_constant: float = SOLAR_CONSTANT,
#
eccentricity_phase_offset: float = ECCENTRICITY_PHASE_OFFSET,
eccentricity_amplitude: float = ECCENTRICITY_CORRECTION_FACTOR,
#
photovoltaic_module: PhotovoltaicModuleModel = PhotovoltaicModuleModel.CSI_FREE_STANDING,
peak_power: float = PEAK_POWER_DEFAULT,
system_efficiency: float | None = SYSTEM_EFFICIENCY_DEFAULT,
power_model: PhotovoltaicModulePerformanceModel = PhotovoltaicModulePerformanceModel.king,
temperature_model: ModuleTemperatureAlgorithm = ModuleTemperatureAlgorithm.faiman,
efficiency: float | None = EFFICIENCY_FACTOR_DEFAULT,
#
mode: SurfacePositionOptimizerMode = SurfacePositionOptimizerMode.Tilt,
method: SurfacePositionOptimizerMethod = SurfacePositionOptimizerMethod.l_bfgs_b,
number_of_sampling_points: int = NUMBER_OF_SAMPLING_POINTS_SURFACE_POSITION_OPTIMIZATION,
iterations: int = NUMBER_OF_ITERATIONS_DEFAULT,
precision_goal: float = 1e-4,
shgo_sampling_method=SurfacePositionOptimizerMethodSHGOSamplingMethod.sobol,
workers: int = WORKERS_FOR_SURFACE_POSITION_OPTIMIZATION,
#
angle_output_units: str = ANGLE_OUTPUT_UNITS_DEFAULT,
#
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
#
verbose: int = VERBOSE_LEVEL_DEFAULT,
log: int = LOG_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
#
profile: bool = cPROFILE_FLAG_DEFAULT,
):
"""
This function optimizes the position of a surface.
Parameters
----------
longitude : float
The longitude of the location.
latitude : float
The latitude of the location.
elevation : float
The elevation of the location.
surface_orientation : SurfaceOrientation
The orientation of the surface.
surface_tilt : SurfaceTilt
The tilt of the surface.
min_surface_orientation : float
The minimum orientation of the surface allowed.
max_surface_orientation : float
The maximum orientation of the surface allowed.
min_surface_tilt : float
The minimum tilt of the surface allowed.
max_surface_tilt : float
The maximum tilt of the surface allowed.
timestamps : DatetimeIndex
The timestamps to use for the optimisation.
timezone : ZoneInfo
The timezone to use for the optimisation.
global_horizontal_irradiance : ndarray | None
The global horizontal irradiance.
direct_horizontal_irradiance : ndarray | None
The direct horizontal irradiance.
spectral_factor_series : SpectralFactorSeries
The spectral factor series.
temperature_series : TemperatureSeries
The temperature series.
wind_speed_series : WindSpeedSeries
The wind speed series.
linke_turbidity_factor_series : LinkeTurbidityFactor
The Linke turbidity factor series.
horizon_profile : DataArray | None
The horizon profile.
shading_model : ShadingModel
The shading model.
shading_states : List[ShadingState]
The shading states.
photovoltaic_module : PhotovoltaicModuleModel
The photovoltaic module.
adjust_for_atmospheric_refraction : bool
Whether to apply atmospheric refraction.
refracted_solar_zenith : float | None
The refracted solar zenith angle.
albedo : float | None
The albedo.
apply_reflectivity_factor : bool
Whether to apply the reflectivity factor.
solar_position_model : SolarPositionModel
The solar position model.
sun_horizon_position : List[SunHorizonPositionModel]
The sun horizon position.
solar_incidence_model : SolarIncidenceModel
The solar incidence model.
zero_negative_solar_incidence_angle : bool
Whether to zero negative solar incidence angles.
solar_time_model : SolarTimeModel
The solar time model.
solar_constant : float
The solar constant.
eccentricity_phase_offset : float
The eccentricity phase offset.
eccentricity_amplitude : float
The eccentricity amplitude.
peak_power : float
The peak power of the photovoltaic module.
system_efficiency : float | None
The system efficiency.
power_model : PhotovoltaicModulePerformanceModel
The power model.
temperature_model : ModuleTemperatureAlgorithm
The temperature model.
efficiency : float | None
The efficiency factor.
mode : SurfacePositionOptimizerMode
The optimization mode. Available options are `Tilt`, `Orientation` and `Orientation & Tilt`.
method : SurfacePositionOptimizerMethod
The optimization method. Multiple options are supported including L-BFGS-B, SHGO, CG.
number_of_sampling_points : int
The number of sampling points.
iterations : int
The number of iterations.
precision_goal : float
The precision goal.
shgo_sampling_method : SurfacePositionOptimizerMethodSHGOSamplingMethod
The sampling method for the SHGO optimizer.
workers : int
The number of workers.
angle_output_units : str
The unit of the angle output.
dtype : str
The data type for the arrays.
array_backend : str
The array backend.
verbose : int
The verbosity level.
log : int
The log level.
fingerprint : bool
Whether to fingerprint the data.
profile : bool
Whether to profile the function.
Returns
-------
optimiser_output : OptimizeResult | ndarray
The optimal surface position.
"""
if profile:
import cProfile
pr = cProfile.Profile()
pr.enable()
# build reusable parameter dictionaries
coordinates = {
'longitude': longitude,
'latitude': latitude,
}
location_arguments = build_location_dictionary(
**coordinates,
elevation=elevation,
)
time = {
'timestamps': timestamps,
'timezone': timezone,
}
horizontal_irradiance = {
'global_horizontal_irradiance': global_horizontal_irradiance,
'direct_horizontal_irradiance': direct_horizontal_irradiance,
}
irradiance_parameters = {
**horizontal_irradiance,
'spectral_factor_series': spectral_factor_series,
'solar_constant': solar_constant,
}
meteorological_variables = {
'temperature_series':temperature_series,
'wind_speed_series': wind_speed_series,
}
solar_positioning = {
'solar_position_model': solar_position_model,
'adjust_for_atmospheric_refraction': adjust_for_atmospheric_refraction,
'solar_time_model': solar_time_model,
}
shading_parameters = {
'horizon_profile': horizon_profile,
'shading_model': shading_model,
'shading_states': shading_states,
}
solar_incidence_parameters = {
'solar_incidence_model': solar_incidence_model,
'zero_negative_solar_incidence_angle': zero_negative_solar_incidence_angle,
}
photovoltaic_performance_parameters = {
'photovoltaic_module': photovoltaic_module,
'peak_power': peak_power,
'system_efficiency': system_efficiency,
'power_model': power_model,
'temperature_model': temperature_model,
'efficiency': efficiency,
}
earth_orbit = {
'eccentricity_phase_offset': eccentricity_phase_offset,
'eccentricity_amplitude': eccentricity_amplitude,
}
array_parameters = {
"dtype": dtype,
"array_backend": array_backend,
}
output_parameters = {
'verbose': verbose,
'log': log,
}
surface_positioning_arguments = build_surface_position_optimisation_mode(
surface_orientation=surface_orientation,
surface_tilt=surface_tilt,
mode=mode,
)
surface_properties = {
'albedo': albedo,
}
# ------------------------------------------
# spectral,
#
#, linke, apply atmospheric refraction
# ------------------------------------------
if not isinstance(global_horizontal_irradiance, ndarray) and not isinstance(
direct_horizontal_irradiance, ndarray
):
direct_horizontal_irradiance = calculate_clear_sky_direct_horizontal_irradiance_series(
# longitude=longitude, # required by some of the solar time algorithms
**location_arguments,
**time,
**solar_positioning,
linke_turbidity_factor_series=linke_turbidity_factor_series,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
solar_constant=solar_constant,
**earth_orbit,
horizon_profile=horizon_profile,
shading_model=shading_model,
**array_parameters,
# validate_output=validate_output,
**output_parameters,
fingerprint=fingerprint,
)
diffuse_horizontal_irradiance = calculate_clear_sky_diffuse_horizontal_irradiance(
**coordinates,
**time,
linke_turbidity_factor_series=linke_turbidity_factor_series,
**solar_positioning,
# unrefracted_solar_zenith=unrefracted_solar_zenith,
solar_constant=solar_constant,
**earth_orbit,
**array_parameters,
**output_parameters,
fingerprint=fingerprint,
)
global_horizontal_irradiance = (
direct_horizontal_irradiance.value
+ diffuse_horizontal_irradiance.value
)
other_input_arguments = build_other_input_arguments_dictionary(
linke_turbidity_factor_series=linke_turbidity_factor_series,
# refracted_solar_zenith=refracted_solar_zenith,
apply_reflectivity_factor=apply_reflectivity_factor,
sun_horizon_position=sun_horizon_position,
#
)
objective_function_arguments = (
location_arguments
| time
| irradiance_parameters
| meteorological_variables
| solar_positioning
| surface_positioning_arguments
| surface_properties
| shading_parameters
| solar_incidence_parameters
| photovoltaic_performance_parameters
| earth_orbit
| other_input_arguments
| output_parameters
)
bounds = define_optimiser_bounds(
min_surface_orientation=min_surface_orientation,
max_surface_orientation=max_surface_orientation,
min_surface_tilt=min_surface_tilt,
max_surface_tilt=max_surface_tilt,
mode=mode,
method=method,
verbose=verbose,
)
optimal_angles: OptimizeResult | ndarray = optimizer(
objective_function_arguments=objective_function_arguments,
func=calculate_mean_negative_photovoltaic_power_output,
method=method,
mode=mode,
bounds=bounds,
number_of_sampling_points=number_of_sampling_points,
iterations=iterations,
precision_goal=precision_goal,
shgo_sampling_method=shgo_sampling_method,
workers=workers,
**output_parameters,
)
# optimal_position = build_optimiser_output(
optimal_position, _optimal_surface_position = build_optimiser_output(
optimiser_output=optimal_angles,
objective_function_arguments=objective_function_arguments,
surface_orientation=surface_orientation,
surface_tilt=surface_tilt,
solar_time_model=solar_time_model,
mode=mode,
method=method,
angle_output_units=angle_output_units,
verbose=verbose,
)
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
from devtools import debug
debug(locals())
log_data_fingerprint(
data=optimal_position,
log_level=log,
hash_after_this_verbosity_level=HASH_AFTER_THIS_VERBOSITY_LEVEL,
)
if fingerprint:
optimal_position[FINGERPRINT_COLUMN_NAME] = generate_hash(optimal_position)
if profile:
import io
import pstats
pr.disable()
# write profiling statistics to file
profile_filename = "profiling_stats.prof"
pr.dump_stats(profile_filename)
print(f"Profiling statistics saved to {profile_filename}")
s = io.StringIO()
sortby = pstats.SortKey.CUMULATIVE
ps = pstats.Stats(pr, stream=s).sort_stats(sortby)
ps.print_stats()
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
print(s.getvalue())
return optimal_position, _optimal_surface_position
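The optimiser minimises the mean negative power output over the surface angle(s) within the given bounds. A minimal, self-contained sketch of that pattern, using a toy power curve and a crude bounded grid search in place of the package's L-BFGS-B/SHGO backends (`OPTIMAL_TILT` and the cosine model are illustrative assumptions, not the package's physics):

```python
from math import cos, radians

# Toy stand-in for the real objective: mean power peaks when the surface
# tilt matches some site-dependent optimum (here, 35 degrees).
OPTIMAL_TILT = radians(35)

def mean_negative_power(tilt: float) -> float:
    """Negative of a toy mean power curve; minimising it maximises power."""
    return -cos(tilt - OPTIMAL_TILT)

def grid_search(func, lower: float, upper: float, samples: int = 1000) -> float:
    """Crude bounded minimiser standing in for L-BFGS-B or SHGO."""
    step = (upper - lower) / samples
    candidates = [lower + i * step for i in range(samples + 1)]
    return min(candidates, key=func)

best_tilt = grid_search(mean_negative_power, 0.0, radians(90))
print(round(best_tilt, 3))  # close to radians(35), about 0.611
```

The real function additionally pre-computes clear-sky irradiance when none is supplied and merges all inputs into `objective_function_arguments` before handing them to the optimiser.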
power ¶
Functions:
| Name | Description |
|---|---|
calculate_mean_negative_photovoltaic_power_output | Calculate the mean negative photovoltaic power output. |
calculate_mean_negative_photovoltaic_power_output ¶
calculate_mean_negative_photovoltaic_power_output(
surface_angle: tuple,
objective_function_arguments: dict,
mode: SurfacePositionOptimizerMode = Tilt,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
)
Calculate the mean negative photovoltaic power output.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
surface_angle | tuple | The angle(s) of the surface to be optimized. | required |
objective_function_arguments | dict | The arguments to be passed to the function that calculates the photovoltaic power output. | required |
mode | SurfacePositionOptimizerMode | The mode of the optimization: optimise the tilt only, the orientation only, or both. | Tilt |
Returns:
| Type | Description |
|---|---|
float | The mean negative photovoltaic power output. |
Source code in pvgisprototype/api/surface/power.py
def calculate_mean_negative_photovoltaic_power_output(
surface_angle: tuple,
objective_function_arguments: dict,
mode: SurfacePositionOptimizerMode = SurfacePositionOptimizerMode.Tilt,
dtype: str = DATA_TYPE_DEFAULT,
array_backend: str = ARRAY_BACKEND_DEFAULT,
):
"""
Calculate the mean negative photovoltaic power output.
Parameters
----------
surface_angle : tuple
The angle(s) of the surface to be optimized.
objective_function_arguments : dict
The arguments to be passed to the function that calculates the photovoltaic
power output.
mode : SurfacePositionOptimizerMode
The mode of the optimization. If `SurfacePositionOptimizerMode.Tilt`, the
function will calculate the photovoltaic power output for the given surface
tilt. If `SurfacePositionOptimizerMode.Orientation`, the function will
calculate the photovoltaic power output for the given surface orientation. If
`SurfacePositionOptimizerMode.Orientation_and_Tilt`, the function will
calculate the photovoltaic power output for the given surface orientation and
tilt.
Returns
-------
float
The mean negative photovoltaic power output.
"""
# In order to avoid unbound errors we pre-define `_series` objects
# ---------------------------------------------------------- Update Me ---
array_parameters = {
"shape": objective_function_arguments['timestamps'].shape,
"dtype": dtype,
"init_method": "zeros",
"backend": array_backend,
} # Borrow shape from timestamps
# zero_array = create_array(**array_parameters)
# Update Me --------------------------------------------------------------
photovoltaic_power_output_series = PhotovoltaicPower().create_array(**array_parameters)
if mode == SurfacePositionOptimizerMode.Tilt:
photovoltaic_power_output_series = calculate_photovoltaic_power_output_series(
surface_tilt=surface_angle,
**objective_function_arguments,
)
if mode == SurfacePositionOptimizerMode.Orientation:
photovoltaic_power_output_series = calculate_photovoltaic_power_output_series(
surface_orientation=surface_angle,
**objective_function_arguments,
)
if mode == SurfacePositionOptimizerMode.Orientation_and_Tilt:
photovoltaic_power_output_series = calculate_photovoltaic_power_output_series(
surface_orientation=surface_angle[0],
surface_tilt=surface_angle[1],
**objective_function_arguments,
)
# return the _negative_ power output !
return -(photovoltaic_power_output_series).value.mean()
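The scalar-versus-tuple convention for `surface_angle` can be sketched with a toy objective (the `Mode` enum, `toy_power`, and the optimum angles below are hypothetical stand-ins for the package's types):

```python
from enum import Enum
from math import cos, radians

class Mode(Enum):
    TILT = "tilt"
    ORIENTATION = "orientation"
    BOTH = "both"

OPT_ORIENTATION = radians(180)  # due south (toy value)
OPT_TILT = radians(35)

def toy_power(orientation: float, tilt: float) -> float:
    return cos(orientation - OPT_ORIENTATION) * cos(tilt - OPT_TILT)

def mean_negative_power(surface_angle, mode: Mode = Mode.TILT) -> float:
    """Dispatch on mode, mirroring the scalar-vs-tuple argument convention."""
    if mode is Mode.TILT:
        return -toy_power(OPT_ORIENTATION, surface_angle)
    if mode is Mode.ORIENTATION:
        return -toy_power(surface_angle, OPT_TILT)
    # Mode.BOTH: the optimiser passes a pair (orientation, tilt)
    orientation, tilt = surface_angle
    return -toy_power(orientation, tilt)

print(mean_negative_power((OPT_ORIENTATION, OPT_TILT), Mode.BOTH))  # -1.0
```

Returning the negative mean lets a generic minimiser maximise power without any special casing.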
presentation_example ¶
Example for a single day
recommender ¶
Functions:
| Name | Description |
|---|---|
recommend_surface_position | Provide an initial guess for the optimisation process based on the provided parameters. |
recommend_surface_position ¶
recommend_surface_position(
mode: SurfacePositionOptimizerMode,
latitude: float,
recommended_surface_tilt: float,
recommended_surface_orientation: float = radians(180),
)
Provide an initial guess for the optimisation process based on the provided parameters.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
mode | SurfacePositionOptimizerMode | The mode of the optimisation process. | required |
latitude | float | The latitude of the location of interest. | required |
recommended_surface_tilt | float | The recommended surface tilt for the optimisation process. | required |
recommended_surface_orientation | float | The recommended surface orientation for the optimisation process. Defaults to π. | radians(180) |
Returns:
| Type | Description |
|---|---|
float or list of floats | The initial guess for the optimisation process. |
Source code in pvgisprototype/api/surface/recommender.py
def recommend_surface_position(
mode: SurfacePositionOptimizerMode,
latitude: float,
recommended_surface_tilt: float,
recommended_surface_orientation: float = radians(180),
):
"""
Provide an initial guess for the optimisation process based on the provided parameters.
Parameters
----------
mode: SurfacePositionOptimizerMode
The mode of the optimisation process.
latitude: float
The latitude of the location of interest.
recommended_surface_tilt: float
The recommended surface tilt for the optimisation process.
recommended_surface_orientation: float, optional
The recommended surface orientation for the optimisation process. Defaults to π.
Returns
-------
float or list of floats
The initial guess for the optimisation process.
"""
if mode == SurfacePositionOptimizerMode.Tilt:
# NOTE INITIAL GUESS FOR SURFACE TILT OPTIMISATION
# NOTE SURFACE TILT IS IN RADIANS
# NOTE SURFACE TILT MUST BE 0-180 DEGREES
# NOTE WE CANNOT HAVE NEGATIVE VALUES
return abs(recommended_surface_tilt)
if mode == SurfacePositionOptimizerMode.Orientation:
# NOTE INITIAL GUESS FOR SURFACE ORIENTATION OPTIMISATION
# NOTE SURFACE ORIENTATION IS IN RADIANS
# NOTE SURFACE ORIENTATION MUST BE BETWEEN 0-360 DEGREES
# NOTE WE CANNOT HAVE NEGATIVE VALUES
if latitude < 0: # NOTE In southern hemisphere
return 0 # NOTE radians
return recommended_surface_orientation # NOTE otherwise we are in the northern hemisphere
if mode == SurfacePositionOptimizerMode.Orientation_and_Tilt:
# NOTE INITIAL GUESS FOR SURFACE ORIENTATION OPTIMISATION
# NOTE SURFACE ORIENTATION IS IN RADIANS
# NOTE SURFACE ORIENTATION MUST BE BETWEEN 0-360 DEGREES
# NOTE WE CANNOT HAVE NEGATIVE VALUES
if latitude < 0: # NOTE In southern hemisphere
return [
0, # NOTE radians
abs(recommended_surface_tilt), # NOTE radians
]
return [
recommended_surface_orientation, # NOTE radians
abs(recommended_surface_tilt), # NOTE radians
]
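The hemisphere rule above reduces to a one-liner; a self-contained sketch (the function name and the degree-valued latitudes are illustrative):

```python
from math import radians

def initial_orientation_guess(latitude: float, south: float = radians(180)) -> float:
    """Equator-facing start: north (0 rad) below the equator, south above."""
    return 0.0 if latitude < 0 else south

print(initial_orientation_guess(-33.9))            # 0.0 (faces north)
print(round(initial_orientation_guess(48.2), 3))   # about 3.142, i.e. pi (faces south)
```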
tmy ¶
Modules:
| Name | Description |
|---|---|
finkelstein_schafer | |
helpers | |
models | |
plot | |
tmy | |
typical_month | |
weighting_scheme_model | |
weighting_schemes | |
finkelstein_schafer ¶
Functions:
| Name | Description |
|---|---|
calculate_weighted_finkelstein_schafer_statistics | Calculate the weighted Finkelstein-Schafer statistic for a meteorological variable using a weighting scheme. |
model_weighted_finkelstein_schafer_statistics | Wrapper API function for calculating Finkelstein-Schafer statistics. |
calculate_weighted_finkelstein_schafer_statistics ¶
calculate_weighted_finkelstein_schafer_statistics(
location_series_data_array: DataArray | Dataset,
meteorological_variable: MeteorologicalVariable,
weighting_scheme: TypicalMeteorologicalMonthWeightingScheme = TYPICAL_METEOROLOGICAL_MONTH_WEIGHTING_SCHEME_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
)
Calculate the weighted Finkelstein-Schafer statistic for a meteorological variable using a weighting scheme.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
location_series_data_array | DataArray | Dataset | Time series data as an xarray object | required |
meteorological_variable | MeteorologicalVariable | Meteorological variable to calculate TMY | required |
weighting_scheme | TypicalMeteorologicalMonthWeightingScheme | Weighting scheme for the calculation of weights, by default TYPICAL_METEOROLOGICAL_MONTH_WEIGHTING_SCHEME_DEFAULT | TYPICAL_METEOROLOGICAL_MONTH_WEIGHTING_SCHEME_DEFAULT |
Returns:
| Type | Description |
|---|---|
dict | Results in a dictionary including metadata, Finkelstein-Schafer statistic, CDFs and daily statistics |
Source code in pvgisprototype/api/tmy/finkelstein_schafer.py
@log_function_call
def calculate_weighted_finkelstein_schafer_statistics(
location_series_data_array: DataArray | Dataset,
meteorological_variable: MeteorologicalVariable,
weighting_scheme: TypicalMeteorologicalMonthWeightingScheme = TYPICAL_METEOROLOGICAL_MONTH_WEIGHTING_SCHEME_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
):
"""Calculate the weighted Finkelstein-Schafer statistic for a meteorological
variable using a weighting scheme.
Parameters
----------
location_series_data_array : DataArray | Dataset
Time series data as an xarray object
meteorological_variable : MeteorologicalVariable
Meteorological variable to calculate TMY
weighting_scheme : TypicalMeteorologicalMonthWeightingScheme, optional
Weighting scheme for the calculation of weights, by default TYPICAL_METEOROLOGICAL_MONTH_WEIGHTING_SCHEME_DEFAULT
Returns
-------
dict
Results in a dictionary including metadata, Finkelstein-Schafer statistic, CDFs and daily statistics
"""
(
finkelstein_schafer_statistic,
daily_statistics,
yearly_monthly_ecdfs,
long_term_monthly_ecdfs,
) = calculate_finkelstein_schafer_statistics(location_series_data_array)
# Weighting as per alternative TMY algorithms
typical_meteorological_month_weights = (
get_typical_meteorological_month_weighting_scheme(
weighting_scheme=weighting_scheme,
meteorological_variable=meteorological_variable,
)
)
weighted_finkelstein_schafer_statistic = finkelstein_schafer_statistic * typical_meteorological_month_weights
ranked_finkelstein_schafer_statistic = weighted_finkelstein_schafer_statistic.rank(dim='year', keep_attrs=True)
components_container = {
"Metadata": lambda: {
},
"Finkelstein-Schafer statistic": lambda: {
"Ranked": ranked_finkelstein_schafer_statistic,
"Weighted": weighted_finkelstein_schafer_statistic,
"Weights": typical_meteorological_month_weights,
"Weighting scheme": weighting_scheme,
"Weighting variable": meteorological_variable,
"Finkelstein-Schafer": finkelstein_schafer_statistic,
},
"Cumulative Distribution": lambda: {
LONG_TERM_MONTHLY_ECDFs_COLUMN_NAME: long_term_monthly_ecdfs,
YEARLY_MONTHLY_ECDFs_COLUMN_NAME: yearly_monthly_ecdfs,
},
"Daily statistics": lambda: {
"Daily statistics": daily_statistics,
},
}
components = {}
for _, component in components_container.items():
components.update(component())
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return components
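The Finkelstein-Schafer statistic at the heart of this function is the mean absolute difference between a candidate month's empirical CDF and the long-term monthly CDF. A pure-Python sketch of that core (the real implementation operates on xarray objects; the `ecdf` helper and the shared evaluation grid here are simplifying assumptions):

```python
def ecdf(sample, grid):
    """Empirical CDF of `sample` evaluated at each point of `grid`."""
    n = len(sample)
    return [sum(1 for x in sample if x <= g) / n for g in grid]

def finkelstein_schafer(long_term, candidate_month):
    """Mean absolute difference between the two ECDFs on a shared grid."""
    grid = sorted(set(long_term) | set(candidate_month))
    f_long = ecdf(long_term, grid)
    f_month = ecdf(candidate_month, grid)
    return sum(abs(a - b) for a, b in zip(f_long, f_month)) / len(grid)

# A month identical to the long-term record scores 0 (maximally "typical").
print(finkelstein_schafer([1, 2, 3, 4], [1, 2, 3, 4]))  # 0.0
```

Lower scores mark more typical months; the weighting step then blends the scores of several meteorological variables before ranking.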
model_weighted_finkelstein_schafer_statistics ¶
model_weighted_finkelstein_schafer_statistics(
time_series: DataArray | Dataset,
meteorological_variable: MeteorologicalVariable,
weighting_scheme: TypicalMeteorologicalMonthWeightingScheme = TYPICAL_METEOROLOGICAL_MONTH_WEIGHTING_SCHEME_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
) -> dict
Wrapper API function for calculating Finkelstein-Schafer statistics.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
time_series | DataArray | Dataset | Time series data as an xarray object | required |
meteorological_variable | MeteorologicalVariable | Meteorological variable to calculate TMY | required |
weighting_scheme | TypicalMeteorologicalMonthWeightingScheme | Weighting scheme for the calculation of weights, by default TYPICAL_METEOROLOGICAL_MONTH_WEIGHTING_SCHEME_DEFAULT | TYPICAL_METEOROLOGICAL_MONTH_WEIGHTING_SCHEME_DEFAULT |
verbose | int | Verbosity level, by default VERBOSE_LEVEL_DEFAULT | VERBOSE_LEVEL_DEFAULT |
Returns:
| Type | Description |
|---|---|
dict | Results in a dictionary including metadata, Finkelstein-Schafer statistic, CDFs and daily statistics |
Source code in pvgisprototype/api/tmy/finkelstein_schafer.py
@log_function_call
def model_weighted_finkelstein_schafer_statistics(
time_series: DataArray | Dataset,
meteorological_variable: MeteorologicalVariable,
weighting_scheme: TypicalMeteorologicalMonthWeightingScheme = TYPICAL_METEOROLOGICAL_MONTH_WEIGHTING_SCHEME_DEFAULT, # type: ignore[assignment]
verbose: int = VERBOSE_LEVEL_DEFAULT,
) -> dict:
"""Wrapper API function for calculating Finkelstein-Schafer statistics.
Parameters
----------
time_series : DataArray | Dataset
Time series data as an xarray object
meteorological_variable : MeteorologicalVariable
Meteorological variable to calculate TMY
weighting_scheme : TypicalMeteorologicalMonthWeightingScheme, optional
Weighting scheme for the calculation of weights, by default TYPICAL_METEOROLOGICAL_MONTH_WEIGHTING_SCHEME_DEFAULT
verbose : int, optional
Verbosity level, by default VERBOSE_LEVEL_DEFAULT
Returns
-------
dict
Results in a dictionary including metadata, Finkelstein-Schafer statistic, CDFs and daily statistics
"""
finkelstein_schafer_statistics = calculate_weighted_finkelstein_schafer_statistics(
location_series_data_array=time_series,
meteorological_variable=meteorological_variable,
weighting_scheme=weighting_scheme,
verbose=verbose,
)
return finkelstein_schafer_statistics
helpers ¶
Functions:
| Name | Description |
|---|---|
get_data_variable_from_dataset | Auto-select if exactly one variable exists. |
retrieve_nested_value | Recursively search for a key in a nested dictionary structure. |
set_matplotlib_backend | Configure matplotlib fonts to support Unicode characters. |
get_data_variable_from_dataset ¶
Auto-select if exactly one variable exists.
Source code in pvgisprototype/api/tmy/helpers.py
def get_data_variable_from_dataset(dataset: Dataset) -> str | None:
"""Auto-select if exactly one variable exists."""
data_vars = list(dataset.data_vars)
if not data_vars:
raise ValueError("No data variables found in dataset!")
if len(data_vars) == 1:
logger.info(
f"Auto-detected single data variable: {data_vars[0]}",
alt=f"[yellow]Auto-detected single data variable :[/yellow] {data_vars[0]}"
)
return data_vars[0]
# Multiple variables - warn and require explicit choice
logger.warning(
f"⚠️ AMBIGUOUS: Dataset has {len(data_vars)} variables: {data_vars}\n"
+ "Please specify: --variable '<variable_name>'"
)
return None # Force user to choose
retrieve_nested_value ¶
Recursively search for a key in a nested dictionary structure.
This function performs a depth-first search through nested dictionaries, OrderedDicts, and similar mappings to find the first occurrence of the specified key.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
dictionary | dict | The nested dictionary structure to search | required |
key | str | The key to search for | required |
default | Any | Default value to return if key not found (default: None) | None |
Returns:
| Type | Description |
|---|---|
Any | The value associated with the key, or default if not found |
Examples:
>>> # Works with OrderedDict from build_output()
>>> tmy_value = retrieve_nested_value(output, 'TMY')
Source code in pvgisprototype/api/tmy/helpers.py
def retrieve_nested_value(
dictionary: dict,
key: str,
default: Any = None,
) -> Any:
"""
Recursively search for a key in a nested dictionary structure.
This function performs a depth-first search through nested dictionaries,
OrderedDicts, and similar mappings to find the first occurrence of the
specified key.
Parameters
----------
dictionary : dict
The nested dictionary structure to search
key : str
The key to search for
default : Any, optional
Default value to return if key not found (default: None)
Returns
-------
Any
The value associated with the key, or default if not found
Examples
--------
>>> data = {'a': {'b': {'c': 42}}}
>>> retrieve_nested_value(data, 'c')
42
>>> retrieve_nested_value(data, 'missing', default='Not found')
'Not found'
>>> # Works with OrderedDict from build_output()
>>> tmy_value = retrieve_nested_value(output, 'TMY')
"""
if isinstance(dictionary, dict):
# Direct key match
if key in dictionary:
logger.debug(f"Found key '{key}' at current level")
return dictionary[key]
# Recursively search each value
for _, value in dictionary.items():
result = retrieve_nested_value(value, key, default=None)
if result is not None:
logger.debug(f"Found key '{key}' in nested structure")
return result
logger.debug(f"Key '{key}' not found in structure")
return default
set_matplotlib_backend ¶
Configure matplotlib fonts to support Unicode characters.
Source code in pvgisprototype/api/tmy/helpers.py
def set_matplotlib_backend(verbose: bool = False):
"""Configure matplotlib fonts to support Unicode characters."""
# logger.getLogger('matplotlib.font_manager').setLevel(logging.WARNING)
plt.rcParams["font.family"] = "DejaVu Sans"
plt.rcParams["font.sans-serif"] = ["DejaVu Sans", "Arial", "Helvetica"]
models ¶
Functions:
| Name | Description |
|---|---|
create_combined_enum | Create a combined Enum from multiple bases and add any additional members. |
select_meteorological_variables | Select models from an enum list. |
select_tmy_models | Select models from an enum list. |
create_combined_enum ¶
Create a combined Enum from multiple bases and add any additional members.
Parameters:
- name: The name of the enum to create.
- bases: Tuple of base enums to combine.
- additional_members: Dict of additional members to add to the combined enum.
Returns:
- Enum: The newly created enum with combined members.
Source code in pvgisprototype/api/tmy/models.py
def create_combined_enum(name, bases, additional_members=None):
"""
Create a combined Enum from multiple bases and add any additional members.
Parameters:
- name: The name of the enum to create.
- bases: Tuple of base enums to combine.
- additional_members: Dict of additional members to add to the combined enum.
Returns:
- Enum: The newly created enum with combined members.
"""
combined_members = {}
for base in bases:  # combine members from every base, not only the first two
    combined_members.update(base.__members__)
# If we have additional members, we add them to the combined members
if additional_members:
combined_members.update(additional_members)
return Enum(name, combined_members)
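A hedged usage sketch of the combine-enums pattern (the `Irradiance` and `Temperature` enums are hypothetical; the loop generalises the merge to any number of bases):

```python
from enum import Enum

def create_combined_enum(name, bases, additional_members=None):
    """Merge the members of several enums into one new Enum."""
    combined = {}
    for base in bases:
        combined.update(base.__members__)
    if additional_members:
        combined.update(additional_members)
    return Enum(name, combined)

class Irradiance(Enum):
    GLOBAL = "global"

class Temperature(Enum):
    AIR = "air"

Combined = create_combined_enum("Combined", (Irradiance, Temperature), {"ALL": "all"})
print([member.name for member in Combined])  # ['GLOBAL', 'AIR', 'ALL']
```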
select_meteorological_variables ¶
select_meteorological_variables(
enum_type: Type[MeteorologicalVariable],
meteorological_variables: List[MeteorologicalVariable],
) -> Sequence[MeteorologicalVariable]
Select models from an enum list.
Source code in pvgisprototype/api/tmy/models.py
def select_meteorological_variables(
enum_type: Type[MeteorologicalVariable],
meteorological_variables: List[MeteorologicalVariable],
) -> Sequence[MeteorologicalVariable]:
"""Select models from an enum list."""
if enum_type.all in meteorological_variables:
return [variable for variable in enum_type if variable != enum_type.all]
return [enum_type(variable) for variable in meteorological_variables]
select_tmy_models ¶
select_tmy_models(
enum_type: Type[TMYStatisticModel],
models: List[TMYStatisticModel],
) -> Sequence[TMYStatisticModel]
Select models from an enum list.
Source code in pvgisprototype/api/tmy/models.py
def select_tmy_models(
enum_type: Type[TMYStatisticModel],
models: List[TMYStatisticModel],
) -> Sequence[TMYStatisticModel]:
"""Select models from an enum list."""
if enum_type.all in models:
return [model for model in enum_type if model != enum_type.all]
# return list(models)
# return models
return [enum_type(model) for model in models]
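Both selectors share the same `all`-sentinel expansion pattern; a self-contained sketch (the `Variable` enum is a hypothetical stand-in for the package's enums):

```python
from enum import Enum

class Variable(Enum):
    temperature = "temperature"
    irradiance = "irradiance"
    all = "all"

def select_variables(enum_type, requested):
    """Expand the `all` sentinel into every concrete member."""
    if enum_type.all in requested:
        return [v for v in enum_type if v is not enum_type.all]
    return [enum_type(v) for v in requested]

print(select_variables(Variable, [Variable.all]))
# the two concrete members, without the sentinel
```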
plot ¶
Modules:
| Name | Description |
|---|---|
distribution | |
finkelstein_schafer | |
statistics | |
tmy | |
distribution ¶
Functions:
| Name | Description |
|---|---|
plot_long_term_monthly_ecdfs | Plot and save ECDFs for each month. |
plot_yearly_monthly_ecdfs | Plot and save ECDFs for each month in a 3x4 grid. |
plot_long_term_monthly_ecdfs ¶
Plot and save ECDFs for each month.
Source code in pvgisprototype/api/tmy/plot/distribution.py
def plot_long_term_monthly_ecdfs(
long_term_ecdf,
plot_path="long_term_monthly_ecdfs.png",
):
"""Plot and save ECDFs for each month."""
import matplotlib.pyplot as plt
fig, ax = plt.subplots(figsize=(10, 6))
long_term_ecdf.plot(ax=ax, label='Long-term Empirical CDF')
ax.set_title('Long-term Monthly Empirical Cumulative Distribution Function')
ax.set_xlabel('Value')
ax.set_ylabel('Cumulative probability')
ax.legend(loc='best')
plt.savefig(plot_path)
plt.close(fig)
plot_yearly_monthly_ecdfs ¶
Plot and save ECDFs for each month in a 3x4 grid.
Source code in pvgisprototype/api/tmy/plot/distribution.py
def plot_yearly_monthly_ecdfs(
yearly_monthly_cdfs,
plot_path="yearly_monthly_ecdfs.png",
):
"""Plot and save ECDFs for each month in a 3x4 grid."""
import matplotlib.pyplot as plt
fig, axes = plt.subplots(nrows=3, ncols=4, figsize=(20, 15))  # 3x4 grid of subplots
axes = axes.flatten()  # flatten the 2D axes array to simplify indexing
for idx, month in enumerate(yearly_monthly_cdfs.month.values):
ax = axes[idx]
yearly_monthly_cdfs.sel(month=month).plot(ax=ax) # Plot on specific subplot
ax.set_title(f"Month {month}")
ax.set_xlabel("Value")
ax.set_ylabel("ECDF")
plt.tight_layout()  # prevent subplot overlap
plt.savefig(plot_path)
plt.close(fig)
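The plotting helpers above take precomputed ECDFs. The underlying empirical CDF itself can be sketched with plain NumPy (the function name here is illustrative, not part of the project's API):

```python
import numpy as np

def empirical_cdf(values):
    """Return sorted sample points and their empirical cumulative
    probabilities; the ECDF steps from 1/n up to 1."""
    x = np.sort(np.asarray(values, dtype=float))
    y = np.arange(1, x.size + 1) / x.size
    return x, y

x, y = empirical_cdf([3.0, 1.0, 2.0, 4.0])
# x -> [1.0, 2.0, 3.0, 4.0], y -> [0.25, 0.5, 0.75, 1.0]
```

Plotting `y` against `x` with a step style (`ax.step(x, y, where="post")`) gives the staircase shape of the monthly ECDFs rendered above.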
finkelstein_schafer ¶
Functions:
| Name | Description |
|---|---|
plot_finkelstein_schafer_statistic | Plot and save ECDFs for each month. |
plot_finkelstein_schafer_statistic ¶
plot_finkelstein_schafer_statistic(
finkelstein_schafer_statistic,
plot_path="finkelstein_schafer_statistic.png",
)
Plot and save ECDFs for each month.
Source code in pvgisprototype/api/tmy/plot/finkelstein_schafer.py
def plot_finkelstein_schafer_statistic(
finkelstein_schafer_statistic,
plot_path="finkelstein_schafer_statistic.png",
):
"""Plot and save ECDFs for each month."""
import matplotlib.pyplot as plt
fig, ax = plt.subplots(figsize=(10, 6))
finkelstein_schafer_statistic.plot(ax=ax, label='FS scores')
ax.set_title('Finkelstein-Schafer Statistic')
ax.set_xlabel('Month')
ax.set_ylabel('Year')
ax.legend(loc='best')
plt.savefig(plot_path)
plt.close(fig)
statistics ¶
Functions:
| Name | Description |
|---|---|
plot_requested_tmy_statistics | Plot the selected models based on the Enum to function mapping. |
plot_requested_tmy_statistics ¶
plot_requested_tmy_statistics(
tmy_series: dict,
variable: str,
statistics: List[TMYStatisticModel],
meteorological_variables: Sequence[
MeteorologicalVariable
],
temperature_series,
relative_humidity_series,
wind_speed_series,
global_horizontal_irradiance,
direct_normal_irradiance,
weighting_scheme: str = "",
limit_x_axis_to_tmy_extent: bool = True,
fingerprint: bool = False,
)
Plot the selected models based on the Enum to function mapping.
Source code in pvgisprototype/api/tmy/plot/statistics.py
def plot_requested_tmy_statistics(
tmy_series: dict,
variable: str,
statistics: List[TMYStatisticModel],
meteorological_variables: Sequence[MeteorologicalVariable],
temperature_series, #: numpy.ndarray = numpy.array(TEMPERATURE_DEFAULT),
relative_humidity_series,
wind_speed_series, #: numpy.ndarray = numpy.array(WIND_SPEED_DEFAULT),
global_horizontal_irradiance, #: ndarray | None = None,
direct_normal_irradiance, #: ndarray | None = None,
weighting_scheme: str = "",
limit_x_axis_to_tmy_extent: bool = True,
fingerprint: bool = False,
):
"""Plot the selected models based on the Enum to function mapping."""
# Map variables to their data series
variable_series_map: Dict[MeteorologicalVariable, any] = {
MeteorologicalVariable.MIN_DRY_BULB_TEMPERATURE: temperature_series,
MeteorologicalVariable.MEAN_DRY_BULB_TEMPERATURE: temperature_series,
MeteorologicalVariable.MAX_DRY_BULB_TEMPERATURE: temperature_series,
MeteorologicalVariable.MEAN_RELATIVE_HUMIDITY: relative_humidity_series,
MeteorologicalVariable.MEAN_WIND_SPEED: wind_speed_series,
MeteorologicalVariable.GLOBAL_HORIZONTAL_IRRADIANCE: global_horizontal_irradiance,
MeteorologicalVariable.DIRECT_NORMAL_IRRADIANCE: direct_normal_irradiance,
}
# Filter map to only variables requested
filtered_variable_map = {
var: data
for var, data in variable_series_map.items()
if var in meteorological_variables
}
# for meteorological_variable in meteorological_variables:
logger.info("Plotting")
for meteorological_variable, time_series in filtered_variable_map.items():
logger.info(
f"- Processing series of {meteorological_variable} based on {time_series.name}",
alt=f"- Processing series of [bold]{meteorological_variable}[/bold] based on [bold]{time_series.name}[/bold]",
)
meteorological_variable_statistics = tmy_series.get(meteorological_variable)
if meteorological_variable_statistics is None:
logger.warning(f"No TMY output for {meteorological_variable}")
continue
for statistic in statistics:
logger.info(f"Statistic {statistic}\n")
if statistic == TMYStatisticModel.tmy:
plot_function = PLOT_FUNCTIONS.get(statistic)
logger.info(
f"- Selected plotting function {plot_function}",
alt=f"- Selected plotting function [code]{plot_function}[/code]"
)
if plot_function is not None:
# Extract the RAW TMY DataArray (not the output dict)
# tmy_dataarray = meteorological_variable_statistics.get(TMYStatisticModel.tmy)
tmy_dataarray = retrieve_nested_value(meteorological_variable_statistics, TMYStatisticModel.tmy.value)
if tmy_dataarray is None:
logger.warning(
f"Warning: TMY data not found for {meteorological_variable}",
alt=f"[red]Warning: TMY data not found for {meteorological_variable}[/red]"
)
continue
typical_months = retrieve_nested_value(meteorological_variable_statistics, 'Typical months')
plot_function(
tmy_series=tmy_dataarray,
variable=variable,
meteorological_variable=meteorological_variable,
finkelstein_schafer_statistic=meteorological_variable_statistics.get(
"Finkelstein-Schafer"
),
typical_months=typical_months,
input_series=time_series,
limit_x_axis_to_tmy_extent=limit_x_axis_to_tmy_extent,
# title=TMYStatisticModel.tmy.name,
title="Typical Meteorological Year",
y_label=meteorological_variable.value,
weighting_scheme=weighting_scheme,
fingerprint=fingerprint,
)
else:
raise ValueError(
f"Plot function for statistic {statistic} not found."
)
elif statistic == TMYStatisticModel.ranked:
plot_function = PLOT_FUNCTIONS.get(statistic.value)
if plot_function is not None:
plot_function(
ranked_finkelstein_schafer_statistic=tmy_series.get(
statistic.value
),
weighting_scheme=weighting_scheme,
)
else:
raise ValueError(
f"Plot function for statistic {statistic} not found."
)
else:
plot_function = PLOT_FUNCTIONS.get(statistic)
if plot_function is not None:
plot_function(tmy_series.get(statistic.value, None))
else:
raise ValueError(
f"Plot function for statistic {statistic} not found."
)
tmy ¶
Functions:
| Name | Description |
|---|---|
plot_tmy | Plot the TMY data with annotations for each month, alongside the original time series. |
plot_tmy ¶
plot_tmy(
tmy_series: Dataset | DataArray,
variable: str,
meteorological_variable: str,
finkelstein_schafer_statistic: DataArray,
typical_months: DataArray,
input_series: DataArray,
input_series_label: str = "Input time series",
weighting_scheme: str = "",
limit_x_axis_to_tmy_extent: bool = True,
title: str = "",
y_label: str = "",
data_source: str = "",
fingerprint: bool = False,
width: int = 16,
height: int = 7,
plot_path: Path = Path(
"typical_meteorological_year.png"
),
to_file=True,
)
Plot the TMY data with annotations for each month, alongside the original time series.
Parameters:
- tmy: Dataset - The TMY dataset containing 'era5_t2m', 'year', and 'month' coordinates.
- location_series_data_array: xr.DataArray - The original time series to compare with.
- title: str - Optional title for the plot.
Source code in pvgisprototype/api/tmy/plot/tmy.py
def plot_tmy(
tmy_series: Dataset | DataArray,
variable: str,
meteorological_variable: str,
finkelstein_schafer_statistic: DataArray,
typical_months: DataArray,
input_series: DataArray,
input_series_label: str = "Input time series",
weighting_scheme: str = "",
limit_x_axis_to_tmy_extent: bool = True, # We rather want it True !
title: str = "",
y_label: str = "",
data_source: str = '',
fingerprint: bool = False,
width: int = 16,
height: int = 7,
plot_path: Path = Path("typical_meteorological_year.png"),
to_file=True,
):
"""
Plot the TMY data with annotations for each month, alongside the original time series.
Parameters:
tmy: Dataset - The TMY dataset containing 'era5_t2m', 'year', and 'month' coordinates.
location_series_data_array: xr.DataArray - The original time series to compare with.
title: str - Optional title for the plot.
"""
# User forgot to specify the `variable` ? Auto-detect it _safely_ !
if variable is None:
if isinstance(tmy_series, DataArray):
variable = tmy_series.name or list(tmy_series.to_dataset().data_vars)[0]
else:
variable = get_data_variable_from_dataset(tmy_series)
if variable is None:
raise ValueError(
"Data variable unset ! Please specify --variable explicitly."
)
fig, ax = plt.subplots(figsize=(width, height))
# supertitle = getattr(time_series, "long_name", "Untitled")
if input_series.name:
input_series_label = getattr(input_series, "name", input_series_label)
# Plot the original location time series (in gray)
if limit_x_axis_to_tmy_extent:
start_time_in_tmy = Timestamp(tmy_series.time.min().values)
end_time_in_tmy = Timestamp(tmy_series.time.max().values)
input_series: DataArray = input_series.sel(time=slice(start_time_in_tmy, end_time_in_tmy))
# input_series_label += f" (actual extent : {start_time_in_tmy} - {end_time_in_tmy})"
input_series.plot( # type: ignore
color="lightgray",
linewidth=1,
ax=ax,
label=input_series_label,
)
month_colors = [
"#74C2E1", # January - Icy blue for winter
"#85A6D9", # February - Frosty blue hinting at winter's end
"#9AE4B0", # March - Fresh green for early spring
"#A9E4A2", # April - Light green for blooming spring
"#F4E956", # May - Bright yellow for late spring
"#F2BE4A", # June - Warm orange for early summer
"#F58A4E", # July - Hot red-orange for midsummer heat
"#F5A65A", # August - Deep orange for the height of summer
"#D48443", # September - Earthy brown-orange for early autumn
"#B86F32", # October - Deep brown hinting at autumn leaves
"#92672C", # November - Dark brown for late autumn
"#7894B1", # December - Cool blue for winter
]
plotted_months = set()
# Plot the TMY data and annotate with month
# Configure fonts
set_matplotlib_backend(verbose=False)
for month in range(1, 13): # Iterate over 12 months
color = month_colors[month - 1]
# Select data for this month from the TMY
# TMY is already assembled as continuous year, so just filter by month coordinate
mask = tmy_series.month == month
# tmy_month = tmy_series[variable].where(mask, drop=True)
tmy_month = tmy_series.where(mask, drop=True)
if len(tmy_month.time) == 0:
continue # Skip if no data for this month
# Plot only one legend item per month
if month not in plotted_months:
# tmy_month.plot(ax=ax, marker="o", label=f"TMY Month {month}, Year {year}")
tmy_month.plot.line(
ax=ax,
marker="o",
# label=f"Typical month {month}",
# label=f"TMY Month {int(month)}",
color=color,
# linewidth=2,
# markersize=4,
)
plotted_months.add(month)
else:
tmy_month.plot.line(
ax=ax,
marker="o",
color=color,
# linewidth=2,
# markersize=4,
)
# Annotate with source YEAR
midpoint_idx = len(tmy_month.time) // 2
midpoint_time = tmy_month.time.values[midpoint_idx]
average = float(tmy_month.mean().values)
# Get source year for this month
source_year = int(typical_months.sel(month=month).values)
ax.annotate(
str(source_year), # Show YEAR not month name
xy=(midpoint_time, average),
xytext=(midpoint_time, average + 1),
ha="center",
fontsize=10,
color="0.2",
weight='bold'
)
# Format x-axis to show month names only (no year)
import matplotlib.dates as mdates
ax.xaxis.set_major_formatter(mdates.DateFormatter('%b'))
ax.xaxis.set_major_locator(mdates.MonthLocator())
ax.grid(True, which="both", linestyle="--", linewidth=0.5)
ax.spines["top"].set_visible(False)
ax.spines["right"].set_visible(False)
ax.spines["bottom"].set_visible(False)
ax.spines["left"].set_visible(False)
ax.set_xlabel('Typical Month')
ax.set_ylabel(y_label)
ax.legend(frameon=False)
main_title = title if title else "Typical Meteorological Year"
plt.suptitle(main_title, fontsize=14, color="darkgray", ha="center")
if weighting_scheme:
plt.title(f"{weighting_scheme}", fontsize=12, color="gray")
# Identity
plt.subplots_adjust(bottom=0.18)
identity_text = "© PVGIS · Joint Research Centre, European Commission"
if data_source:
identity_text += f" · Data source : {data_source}"
if fingerprint:
import re
from pvgisprototype.core.hashing import generate_hash
# data_array_hash = generate_hash(tmy_series[variable])
data_array_hash = generate_hash(tmy_series)
identity_text += f" · Fingerprint : {data_array_hash}"
safe_fingerprint = re.sub(r"[:]", "-", data_array_hash)  # Replace colons with hyphens
safe_fingerprint = safe_fingerprint.replace(" ", "T")  # Ensure ISO format with 'T'
plot_path = plot_path.with_stem(plot_path.stem + f"_{safe_fingerprint}")
fig.text(
0.5,
0.02,
identity_text,
fontsize=12,
color="gray",
ha="center",
alpha=0.5,
)
if to_file:
# output_path = plot_path.with_stem(f"{plot_path.stem}_{variable}_{meteorological_variable.name.lower()}")
output_path = plot_path.with_stem(f"{plot_path.stem}_{meteorological_variable.name.lower()}")
plt.savefig(output_path, bbox_inches="tight")
else:
return fig
tmy ¶
Functions:
| Name | Description |
|---|---|
calculate_tmy | Calculate the Typical Meteorological Year (TMY) |
calculate_weighted_sum | Calculate weighted sum of Finkelstein-Schafer statistics for each variable. |
calculate_tmy ¶
calculate_tmy(
meteorological_variables: Sequence[
MeteorologicalVariable
],
temperature_series,
relative_humidity_series,
wind_speed_series,
global_horizontal_irradiance,
direct_normal_irradiance,
timestamps: Timestamp | DatetimeIndex = now(),
weighting_scheme: TypicalMeteorologicalMonthWeightingScheme = TYPICAL_METEOROLOGICAL_MONTH_WEIGHTING_SCHEME_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
)
Calculate the Typical Meteorological Year (TMY)
Calculate the Typical Meteorological Year using the default ISO 15927-4 standard or other methods.
Notes
ISO 15927-4
The procedure to construct Typical Meteorological Years (TMY) follows the ISO 15927-4 [0]_ standard. For each month in the year, the data are taken from the year calculated as most “typical” for that month. The Standard specifies the method to construct the TMY based on a statistical evaluation of air temperature, relative humidity and solar radiation, with a less important contribution from the wind speed data.
1. For each of the three quantities (air temperature, relative humidity
and solar radiation), calculate the daily means from the hourly values.
2. For each quantity q and each month m, calculate the cumulative
distribution function 𝜙(𝑞,𝑚) using all the daily values for all years.
3. For each quantity q, each year y and each month m, calculate the
cumulative distribution function 𝐹(𝑞,𝑚,𝑦) using all the daily values
for that year.
4. For each q, m and y, calculate the Finkelstein–Schafer statistic,
summing over the range of the distribution values:
𝐹𝑆(𝑞,𝑚,𝑦) = ∑|𝐹(𝑞,𝑚,𝑦) − 𝜙(𝑞,𝑚,𝑦)|. Equation (1) in [1]_
5. For each m and q, rank the individual months in the multi-year
period in order of increasing 𝐹𝑆(𝑞,𝑚,𝑦).
6. For each m and y, add the ranks for the three quantities.
7. For each m, for the three months with the lowest total ranking,
calculate the deviation of the monthly average wind speed from the
multi-year mean for that month. The lowest deviation in wind speed is
used to select the “best” month to be included in the TMY.
Common algorithm outlined in PVSyst [2]_
Calculate the Typical Meteorological Year based on the following
algorithm:
1. Read _at least_ 10 years of hourly time series over a location
2. Compute daily maximum, minimum and mean of selected variables (cf.
weight list below).
3. Compute the cumulative distribution function (CDF) of each variable
for each month:
3.1 one cumulative distribution function for each variable, each
month and each year of data e.g. for the GHI: one for Jan. 2011,
one for Jan. 2012, one for Jan 2013, ... and for each month the
same for TAmb, or other variables
3.2 one long-term cumulative distribution function for each
variable and each month e.g. one for GHI for January containing all
daily values for 2011 to 2020
4. Compute the weighted sum (WS) of the Finkelstein-Schafer statistic
(FS) for each variable:
4.1 Compute FS, the sum, over the n days of a month, of the
absolute difference between the long-term CDF and the candidate
month CDF at value xi
4.2 Compute WS, the weighted sum of FS for each month of each year
5. Rank each month by lowest weighted sum WS (rank every January,
every February, ...)
6. Select each month based on various criteria defined in the different
norms/methods
6.1 The final step for choosing months in the ISO norm is to
compare the wind speed of the best 3 months from the ranked WS to
the long-term average and choose the one with the lowest
difference.
6.2 For the Sandia and NREL methods, the best 5 months
from the ranked WS are re-ranked by their closeness to the
long-term average and median. The 5 months are then filtered by
analyzing the frequency and length of extrema in ambient
temperature and global horizontal irradiance.
7. Concatenate the selected months into a single continuous year (e.g.
Jan 2015, Feb 2011, Mar 2017, etc...), interpolate the values of
different variables at the month boundaries to smooth out
discontinuities.
References
.. [0] International Organization for Standardization (ISO). ISO 15927-4. Hygrothermal Performance of Buildings—Calculation and Presentation of Climatic Data—Part 4: Hourly Data for Assessing the Annual Energy Use for Heating and Cooling; Technical Report; International Organization for Standardization: Geneva, Switzerland, 2005.
.. [1] https://doi.org/10.3390/atmos9020053
.. [2] https://www.pvsyst.com/help/meteo_tmy_algorithms.htm
.. [3] https://www.sciencedirect.com/science/article/pii/S0960148120311009?via%3Dihub
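Step 4 of the algorithm above (the Finkelstein-Schafer statistic) can be sketched in a few lines of NumPy. This is an illustrative stand-alone implementation of Equation (1), not the project's own `model_weighted_finkelstein_schafer_statistics`:

```python
import numpy as np

def finkelstein_schafer(candidate_daily, long_term_daily):
    """FS statistic: sum of absolute differences between the candidate
    month's ECDF and the long-term ECDF, evaluated at the candidate's
    daily values (Equation (1) above)."""
    candidate = np.sort(np.asarray(candidate_daily, dtype=float))
    long_term = np.sort(np.asarray(long_term_daily, dtype=float))
    n = candidate.size
    # Candidate ECDF at its own sorted sample points: 1/n, 2/n, ..., 1
    f_candidate = np.arange(1, n + 1) / n
    # Long-term ECDF evaluated at the same points
    f_long_term = np.searchsorted(long_term, candidate, side="right") / long_term.size
    return float(np.abs(f_candidate - f_long_term).sum())

# A candidate month identical to the long-term record scores zero
fs = finkelstein_schafer([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
```

Months whose distribution deviates from the long-term one score higher, so ranking by increasing FS (step 5) prefers the most "typical" candidates.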
Source code in pvgisprototype/api/tmy/tmy.py
@log_function_call
def calculate_tmy(
# time_series,
meteorological_variables: Sequence[MeteorologicalVariable],
temperature_series, #: numpy.ndarray = numpy.array(TEMPERATURE_DEFAULT),
relative_humidity_series,
wind_speed_series, #: numpy.ndarray = numpy.array(WIND_SPEED_DEFAULT),
# wind_speed_variable: str | None,
global_horizontal_irradiance, #: ndarray | None = None,
direct_normal_irradiance, #: ndarray | None = None,
timestamps: Timestamp | DatetimeIndex = Timestamp.now(),
weighting_scheme: TypicalMeteorologicalMonthWeightingScheme = TYPICAL_METEOROLOGICAL_MONTH_WEIGHTING_SCHEME_DEFAULT,
verbose: int = VERBOSE_LEVEL_DEFAULT,
fingerprint: bool = FINGERPRINT_FLAG_DEFAULT,
):
"""Calculate the Typical Meteorological Year (TMY)
Calculate the Typical Meteorological Year using the default ISO 15927-4
standard or other methods.
Notes
-----
ISO 15927-4
The procedure to construct Typical Meteorological Years (TMY) follows the
ISO 15927-4 [0]_ standard. For each month in the year, the data are taken
from the year calculated as most “typical” for that month. The Standard
specifies the method to construct the TMY based on a statistical evaluation
of air temperature, relative humidity and solar radiation, with a less
important contribution from the wind speed data.
1. For each of the three quantities (air temperature, relative humidity
and solar radiation), calculate the daily means from the hourly values.
2. For each quantity q and each month m, calculate the cumulative
distribution function 𝜙(𝑞,𝑚) using all the daily values for all years.
3. For each quantity q, each year y and each month m, calculate the
cumulative distribution function 𝐹(𝑞,𝑚,𝑦) using all the daily values
for that year.
4. For each q, m and y, calculate the Finkelstein–Schafer statistic,
summing over the range of the distribution values:
𝐹𝑆(𝑞,𝑚,𝑦) = ∑|𝐹(𝑞,𝑚,𝑦) − 𝜙(𝑞,𝑚,𝑦)|. Equation (1) in [1]_
5. For each m and q, rank the individual months in the multi-year
period in order of increasing 𝐹𝑆(𝑞,𝑚,𝑦).
6. For each m and y, add the ranks for the three quantities.
7. For each m, for the three months with the lowest total ranking,
calculate the deviation of the monthly average wind speed from the
multi-year mean for that month. The lowest deviation in wind speed is
used to select the “best” month to be included in the TMY.
Common algorithm outlined in PVSyst [2]_
Calculate the Typical Meteorological Year based on the following
algorithm:
1. Read _at least_ 10 years of hourly time series over a location
2. Compute daily maximum, minimum and mean of selected variables (cf.
weight list below).
3. Compute the cumulative distribution function (CDF) of each variable
for each month:
3.1 one cumulative distribution function for each variable, each
month and each year of data e.g. for the GHI: one for Jan. 2011,
one for Jan. 2012, one for Jan 2013, ... and for each month the
same for TAmb, or other variables
3.2 one long-term cumulative distribution function for each
variable and each month e.g. one for GHI for January containing all
daily values for 2011 to 2020
4. Compute the weighted sum (WS) of the Finkelstein-Schafer statistic
(FS) for each variable:
4.1 Compute FS, the sum, over the n days of a month, of the
absolute difference between the long-term CDF and the candidate
month CDF at value xi
4.2 Compute WS, the weighted sum of FS for each month of each year
5. Rank each month by lowest weighted sum WS (rank every January,
every February, ...)
6. Select each month based on various criteria defined in the different
norms/methods
6.1 The final step for choosing months in the ISO norm is to
compare the wind speed of the best 3 months from the ranked WS to
the long-term average and choose the one with the lowest
difference.
6.2 For the Sandia and NREL methods, the best 5 months
from the ranked WS are re-ranked by their closeness to the
long-term average and median. The 5 months are then filtered by
analyzing the frequency and length of extrema in ambient
temperature and global horizontal irradiance.
7. Concatenate the selected months into a single continuous year (e.g.
Jan 2015, Feb 2011, Mar 2017, etc...), interpolate the values of
different variables at the month boundaries to smooth out
discontinuities.
References
----------
.. [0] International Organization for Standardization (ISO). ISO 15927-4.
Hygrothermal Performance of Buildings—Calculation and Presentation of
Climatic Data—Part 4: Hourly Data for Assessing the Annual Energy Use for
Heating and Cooling; Technical Report; International Organization for
Standardization: Geneva, Switzerland, 2005.
.. [1] https://doi.org/10.3390/atmos9020053
.. [2] https://www.pvsyst.com/help/meteo_tmy_algorithms.htm
.. [3] https://www.sciencedirect.com/science/article/pii/S0960148120311009?via%3Dihub
"""
# For each meteorological variable of
# air temperature, relative humidity and solar radiation
# Map variables to their data series
variable_series_map: Dict[MeteorologicalVariable, any] = {
MeteorologicalVariable.MIN_DRY_BULB_TEMPERATURE: temperature_series,
MeteorologicalVariable.MEAN_DRY_BULB_TEMPERATURE: temperature_series,
MeteorologicalVariable.MAX_DRY_BULB_TEMPERATURE: temperature_series,
MeteorologicalVariable.MEAN_RELATIVE_HUMIDITY: relative_humidity_series,
MeteorologicalVariable.MEAN_WIND_SPEED: wind_speed_series,
MeteorologicalVariable.GLOBAL_HORIZONTAL_IRRADIANCE: global_horizontal_irradiance,
MeteorologicalVariable.DIRECT_NORMAL_IRRADIANCE: direct_normal_irradiance,
}
# Filter map to only variables requested
filtered_variable_map = {
var: data
for var, data in variable_series_map.items()
if var in meteorological_variables
}
results = {}
for meteorological_variable, time_series in filtered_variable_map.items():
logger.info(
f"Processing series of {meteorological_variable.value}",
alt=f"Processing series of [code]{meteorological_variable.value}[/code]"
)
# 1 Finkelstein-Schafer statistic for each month and year
finkelstein_schafer_statistics = model_weighted_finkelstein_schafer_statistics(
time_series=time_series,
meteorological_variable=meteorological_variable,
weighting_scheme=weighting_scheme,
verbose=verbose,
)
ranked_finkelstein_schafer_statistic = finkelstein_schafer_statistics.get(
FinkelsteinSchaferStatisticModel.ranked, NOT_AVAILABLE
)
# 2 Select the "typical" year for each month (
typical_months = select_typical_month_iso_15927_4(
ranked_fs_statistic=ranked_finkelstein_schafer_statistic,
wind_speed_series=wind_speed_series,
# wind_speed_variable=wind_speed_variable,
timestamps=timestamps,
verbose=verbose,
)
# After collecting selected months, reassemble into continuous TMY
typical_meteorological_months = []
for month_num in typical_months.month.values:
selected_year = int(typical_months.sel(month=month_num).values)
# Extract the month data from its source year
selected_month_data = time_series.sel(time=f"{selected_year}-{month_num:02d}")
typical_meteorological_months.append(selected_month_data)
# # 4 Merge selected months
# tmy = merge(typical_meteorological_months)
# 4 Concatenate selected months along time dimension
tmy = concat(typical_meteorological_months, dim='time')
# Create synthetic timestamps for a continuous typical year
# Use a reference year (e.g., 2005 or first year in dataset)
reference_year = int(time_series.time.dt.year.min().values)
# Generate new timestamps mapping to the reference year
original_times = tmy.time.values
new_times = []
for _index, original_time in enumerate(original_times):
original_dt = Timestamp(original_time)
# Map to same month/day/hour in reference year
new_time = original_dt.replace(year=reference_year)
new_times.append(new_time)
# Assign the synthetic timestamps
tmy = tmy.assign_coords(time=('time', new_times))
# Add month and year as coordinates for plotting
tmy = tmy.assign_coords(
month=('time', tmy.time.dt.month.values),
year=('time', [reference_year] * len(tmy.time))
)
# Step 5: Wrap in data model and build output
tmy_model = TypicalMeteorologicalVariableYear(
value=tmy.values if hasattr(tmy, 'values') else tmy,  # raw array values alongside the DataArray
tmy=tmy,
weighting_scheme=weighting_scheme,
finkelstein_schafer_statistics=finkelstein_schafer_statistics,
wind_speed=wind_speed_series,
meteorological_variable=meteorological_variable.value,
typical_months=typical_months,
)
tmy_model.build_output(
verbose=verbose,
fingerprint=fingerprint,
)
# # # 5 Smooth discontinuities between months ? ------------------------
# # tmy_smoothed = tmy.interpolate_na(dim="time", method="linear")
# # --------------------------------------------------------------------
results[meteorological_variable] = tmy_model.output
if verbose > DEBUG_AFTER_THIS_VERBOSITY_LEVEL:
debug(locals())
return results
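The reassembly step in `calculate_tmy` remaps the timestamps of months drawn from different source years onto one synthetic reference year. A stdlib-only sketch of that remapping (the function name is illustrative; the source uses `pandas.Timestamp.replace` the same way):

```python
from datetime import datetime

def remap_to_reference_year(timestamps, reference_year=2005):
    """Map timestamps of months drawn from different source years onto a
    single continuous reference year. Note: a Feb 29 from a leap source
    year needs special handling when the reference year is not a leap
    year, since replace(year=...) would raise ValueError."""
    return [t.replace(year=reference_year) for t in timestamps]

# Jan from 2011, Feb from 2017, Mar from 2015 become one synthetic year
mixed = [datetime(2011, 1, 15), datetime(2017, 2, 15), datetime(2015, 3, 15)]
remapped = remap_to_reference_year(mixed)
```

After remapping, the concatenated months sort into a single continuous year, which is what the month/year coordinates assigned in `calculate_tmy` rely on.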
calculate_weighted_sum ¶
Calculate weighted sum of Finkelstein-Schafer statistics for each variable.
Source code in pvgisprototype/api/tmy/tmy.py
typical_month ¶
Functions:
| Name | Description |
|---|---|
select_typical_month_iso_15927_4 | Select a typical meteorological month for each calendar month using ISO |
select_typical_month_iso_15927_4 ¶
select_typical_month_iso_15927_4(
ranked_fs_statistic,
wind_speed_series,
timestamps=None,
verbose=0,
)
Select a typical meteorological month for each calendar month using ISO 15927-4 method.
ISO 15927-4 specifies that for each calendar month, if using the three top-ranked months (lowest FS statistic), the final selection should be based on wind speed deviation from the multi-year mean.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
ranked_fs_statistic | DataArray | Ranked Finkelstein-Schafer statistics with dimensions (month, year) | required |
wind_speed_series | Dataset | Time series data containing wind speed and other variables | required |
timestamps | Timestamp or DatetimeIndex | Timestamps used to select wind speed values by year and month | None |
verbose | int | Verbosity level for debugging | 0 |
Returns:
| Name | Type | Description |
|---|---|---|
typical_months | DataArray | Selected source year for each calendar month (1-12) |
Notes
ISO 15927-4 Step 7 implementation.
For each calendar month:
- Get the 3 candidate months (from different years) with lowest FS ranking
- Calculate wind speed deviation from long-term mean for each candidate
- Select the candidate with smallest wind speed deviation
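The final selection among the three lowest-FS candidates reduces to picking the year whose monthly mean wind speed deviates least from the long-term mean. A minimal sketch of that step (function name and input values are illustrative):

```python
def select_by_wind_speed_deviation(candidate_years, monthly_means, long_term_mean):
    """From the candidate years with the lowest FS ranking, pick the one
    whose monthly mean wind speed deviates least from the long-term mean
    (ISO 15927-4 step 7)."""
    deviations = {
        year: abs(monthly_means[year] - long_term_mean)
        for year in candidate_years
    }
    return min(deviations, key=deviations.get)

# January candidates already ranked by FS; means are hypothetical
selected = select_by_wind_speed_deviation(
    candidate_years=[2011, 2015, 2017],
    monthly_means={2011: 4.2, 2015: 3.6, 2017: 5.1},
    long_term_mean=3.8,
)
# 2015 wins: its mean (3.6) is closest to the long-term mean (3.8)
```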
Source code in pvgisprototype/api/tmy/typical_month.py
@log_function_call
def select_typical_month_iso_15927_4(
ranked_fs_statistic,
wind_speed_series,
# wind_speed_variable=None,
timestamps=None, # Need this to index by time
verbose=0,
):
"""Select a typical meteorological month for each calendar month using ISO
15927-4 method.
ISO 15927-4 specifies that for each calendar month, if using the three
top-ranked months (lowest FS statistic), the final selection should be
based on wind speed deviation from the multi-year mean.
Parameters
----------
ranked_fs_statistic : xarray.DataArray
Ranked Finkelstein-Schafer statistics with dimensions (month, year)
wind_speed_series : xarray.Dataset
Time series data containing wind speed and other variables
timestamps : pandas.DatetimeIndex
Timestamps used to select wind speed values by year and month
verbose : int
Verbosity level for debugging
Returns
-------
typical_months : xarray.DataArray
Selected source year for each calendar month (1-12)
Notes
-----
ISO 15927-4 Step 7 implementation.
For each calendar month:
1. Get the 3 candidate months (from different years) with lowest FS ranking
2. Calculate wind speed deviation from long-term mean for each candidate
3. Select the candidate with smallest wind speed deviation
"""
typical_months = {}
# Extract wind speed values (assuming WindSpeedSeries has .value attribute)
# Adjust this based on your actual WindSpeedSeries structure
wind_speed = (
wind_speed_series.value
if hasattr(wind_speed_series, "value")
else wind_speed_series
)
# For each calendar month (1-12)
for month in range(1, 13):
# Step 7a: Get the 3 months with lowest FS ranking for this calendar month
# ranked_fs_statistic has dims (year, month)
month_fs_scores = ranked_fs_statistic.sel(month=month) # Shape: (n_years,)
# Sort from lowest to highest
sorted_indices = np.argsort(month_fs_scores.values)
# Indices of the 3 lowest FS scores (best candidates)
lowest_3_indices = sorted_indices[:3]
# Get the actual years corresponding to these 3 candidates
candidate_years = ranked_fs_statistic.year.values[lowest_3_indices]
if verbose > 2:
print(f" Candidate years : {candidate_years}")
print(f" Scores : {month_fs_scores.values[lowest_3_indices]}")
# Step 7b: Calculate long-term mean wind speed for this calendar month
# (averaging across all years)
mask_all_years_this_month = timestamps.month == month
long_term_wind_speed_mean = wind_speed[mask_all_years_this_month].mean()
if verbose > 2:
print(
f" Long-term wind speed mean for month {month}: {long_term_wind_speed_mean:.3f}"
)
# Step 7c: For each of the 3 candidates, calculate deviation from long-term mean
wind_speed_deviations = []
for year in candidate_years:
# Select wind data for this specific year-month combination
mask = (timestamps.year == year) & (timestamps.month == month)
this_year_month_wind_speed = wind_speed[mask]
# Calculate mean wind speed for this candidate month
this_year_month_wind_speed_mean = this_year_month_wind_speed.mean()
# Calculate absolute deviation from long-term mean
wind_speed_deviation = abs(
this_year_month_wind_speed_mean - long_term_wind_speed_mean
)
wind_speed_deviations.append(wind_speed_deviation)
if verbose > 2:
print(
f" Year {year} : mean = {this_year_month_wind_speed_mean :.3f}, deviation = {wind_speed_deviation:.3f}"
)
# Step 7d: Select the year with the LOWEST wind speed deviation == typical conditions
lowest_wind_speed_deviation_idx = np.argmin(wind_speed_deviations)
selected_year = candidate_years[lowest_wind_speed_deviation_idx]
typical_months[month] = int(selected_year)
if verbose > 1:
print(
f"- Month/Year {month}/{selected_year} "
f" (Deviation: {wind_speed_deviations[lowest_wind_speed_deviation_idx]:.3f})"
)
# Package as Xarray
typical_months = xr.DataArray(
data=list(typical_months.values()),
dims=["month"],
coords={"month": list(typical_months.keys())},
)
return typical_months
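The month-selection logic above can be sketched with toy NumPy data (all names and numbers below are illustrative only; the real function operates on xarray objects with month and year dimensions):

```python
import numpy as np

# Toy FS statistics for four years of one calendar month (lower is better)
years = np.array([2001, 2002, 2003, 2004])
fs_scores = np.array([0.12, 0.05, 0.30, 0.08])

# Mean wind speed of that month in each year, and the long-term mean
monthly_wind_means = np.array([3.1, 4.0, 3.5, 3.3])
long_term_mean = monthly_wind_means.mean()

# Step 7a: the three candidate years with the lowest FS statistic
candidates = np.argsort(fs_scores)[:3]

# Steps 7c-7d: pick the candidate whose mean wind speed deviates least
deviations = np.abs(monthly_wind_means[candidates] - long_term_mean)
selected_year = int(years[candidates[np.argmin(deviations)]])
```

With these toy values the three candidates are 2002, 2004 and 2001, and 2004 wins because its monthly wind speed mean sits closest to the long-term mean.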
weighting_scheme_model ¶
Functions:
| Name | Description |
|---|---|
get_typical_meteorological_month_weighting_scheme | Retrieve the specific weight or full scheme for a variable under a meteorological month weighting scheme. |
get_typical_meteorological_month_weighting_scheme ¶
get_typical_meteorological_month_weighting_scheme(
weighting_scheme: TypicalMeteorologicalMonthWeightingScheme,
meteorological_variable: (
MeteorologicalVariable | None
) = None,
) -> float | str
Retrieve the specific weight or full scheme for a variable under a meteorological month weighting scheme.
Source code in pvgisprototype/api/tmy/weighting_scheme_model.py
def get_typical_meteorological_month_weighting_scheme(
weighting_scheme: TypicalMeteorologicalMonthWeightingScheme,
meteorological_variable: MeteorologicalVariable | None = None,
) -> float | str:
"""Retrieve the specific weight or full scheme for a variable under a meteorological month weighting scheme."""
if weighting_scheme == TypicalMeteorologicalMonthWeightingScheme.all:
output = []
for scheme_name, scheme_weights in WEIGHTING_SCHEMES.items():
if meteorological_variable:
weight = scheme_weights.get(meteorological_variable)
output.append(
f"{scheme_name.value}: {weight if weight is not None else f'No weight for {meteorological_variable.name}'}"
)
else:
output.append(f"{scheme_name}: {scheme_weights}")
return "\n".join(output)
scheme_weights = WEIGHTING_SCHEMES.get(weighting_scheme)
if not scheme_weights:
raise ValueError(f"No weighting scheme available for {weighting_scheme.name}")
if meteorological_variable:
weight = scheme_weights.get(meteorological_variable)
if weight is None:
raise ValueError(f"No weight defined for '{meteorological_variable.name}' in scheme {weighting_scheme.name}.")
return weight
return scheme_weights # Return the full scheme if no specific variable is requested
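The lookup behaviour can be sketched with plain dictionaries standing in for the library's WEIGHTING_SCHEMES mapping and its enums (hypothetical names and values):

```python
# Plain-dict stand-in for the library's WEIGHTING_SCHEMES mapping
weighting_schemes = {
    "iso_15927_4": {
        "mean_dry_bulb_temperature": 1.0,
        "global_horizontal_irradiance": 1.0,
    },
    "sandia": {
        "mean_dry_bulb_temperature": 2 / 24,
        "global_horizontal_irradiance": 12 / 24,
    },
}

def get_weight(scheme: str, variable: str) -> float:
    """Mirror the scheme/variable lookup with explicit error handling."""
    scheme_weights = weighting_schemes.get(scheme)
    if not scheme_weights:
        raise ValueError(f"No weighting scheme available for {scheme}")
    weight = scheme_weights.get(variable)
    if weight is None:
        raise ValueError(f"No weight defined for '{variable}' in scheme {scheme}.")
    return weight

weight = get_weight("sandia", "global_horizontal_irradiance")  # 0.5
```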
weighting_schemes ¶
This file is not used at the moment; it is kept as a reference.
Variable weights for the FS statistics under each methodology:
| Index of daily values | ISO 15927-4_2005 | Sandia Method | NSRDB TMY |
|---|---|---|---|
| Maximum Dry Bulb Temperature | 0 | 1/24 | 1/20 |
| Minimum Dry Bulb Temperature | 0 | 1/24 | 1/20 |
| Mean Dry Bulb Temperature | 1 | 2/24 | 2/20 |
| Maximum Dew Point Temperature | 0 | 1/24 | 1/20 |
| Minimum Dew Point Temperature | 0 | 1/24 | 1/20 |
| Mean Dew Point Temperature | 0 | 2/24 | 2/20 |
| Maximum Wind Velocity | 0 | 2/24 | 1/20 |
| Mean Wind Velocity | 0* | 2/24 | 1/20 |
| Mean Relative Humidity | 1 | 0 | 0 |
| Global horizontal irradiance | 1 | 12/24 | 5/20 |
| Direct normal irradiance | 0 | 0 | 5/20 |
iso_15927_4 = "ISO 15927-4_2005"
sandia = "Sandia Method"
nsrdb = "NSRDB TMY"
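As a quick consistency check, the Sandia and NSRDB columns of the table each sum to one (a sketch using exact fractions; the shorthand variable keys are illustrative only):

```python
from fractions import Fraction as F

# The weights table encoded per scheme (variable -> weight)
sandia = {
    "max_dry_bulb": F(1, 24), "min_dry_bulb": F(1, 24), "mean_dry_bulb": F(2, 24),
    "max_dew_point": F(1, 24), "min_dew_point": F(1, 24), "mean_dew_point": F(2, 24),
    "max_wind": F(2, 24), "mean_wind": F(2, 24), "mean_rh": F(0),
    "ghi": F(12, 24), "dni": F(0),
}
nsrdb = {
    "max_dry_bulb": F(1, 20), "min_dry_bulb": F(1, 20), "mean_dry_bulb": F(2, 20),
    "max_dew_point": F(1, 20), "min_dew_point": F(1, 20), "mean_dew_point": F(2, 20),
    "max_wind": F(1, 20), "mean_wind": F(1, 20), "mean_rh": F(0),
    "ghi": F(5, 20), "dni": F(5, 20),
}
assert sum(sandia.values()) == 1 and sum(nsrdb.values()) == 1
```

The ISO 15927-4 column is not normalised the same way: it uses equal unit weights over its three selected variables.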
utilities ¶
Modules:
| Name | Description |
|---|---|
conversions | |
conversions ¶
Functions:
| Name | Description |
|---|---|
convert_float_to_degrees_if_requested | Convert angle from radians to degrees if requested |
convert_float_to_radians_if_requested | Convert angle from degrees to radians if requested |
convert_series_to_degrees_arrays_if_requested | Vectorized conversion of a series of angle data from radians to degrees if requested. |
convert_series_to_degrees_if_requested | Vectorized conversion of a series of angle data from radians to degrees if requested. |
convert_series_to_radians_if_requested | Vectorized conversion of a series of angle data from degrees to radians if requested. |
convert_to_degrees | Convert angle to degrees. |
convert_to_degrees_if_requested | Convert angle from radians to degrees if requested |
convert_to_radians | Convert floating point angular measurement from degrees to radians. |
convert_to_radians_fastapi | Convert angle to radians. |
convert_to_radians_if_requested | Convert angle from degrees to radians for a single or an array of custom data structures if requested. |
round_float_values | Recursively round float attributes in a custom data class or any float. |
convert_float_to_degrees_if_requested ¶
Convert angle from radians to degrees if requested
Source code in pvgisprototype/api/utilities/conversions.py
convert_float_to_radians_if_requested ¶
Convert angle from degrees to radians if requested
convert_series_to_degrees_arrays_if_requested ¶
convert_series_to_degrees_arrays_if_requested(
data_class_series: List[Any], angle_output_units: str
) -> ndarray
Vectorized conversion of a series of angle data from radians to degrees if requested.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
data_class_series | List[Any] | A list of data classes containing the angle value and unit. | required |
angle_output_units | str | The desired output unit ('degrees' or 'radians'). | required |
Returns:
| Type | Description |
|---|---|
| ndarray | An array of the converted angle values. |
Source code in pvgisprototype/api/utilities/conversions.py
def convert_series_to_degrees_arrays_if_requested(
data_class_series: List[Any],
angle_output_units: str,
) -> np.ndarray:
"""
Vectorized conversion of a series of angle data from radians to degrees if requested.
Parameters
----------
data_class_series : List[Any]
A list of data classes containing the angle value and unit.
angle_output_units : str
The desired output unit ('degrees' or 'radians').
Returns
-------
    np.ndarray
        An array of the converted angle values.
"""
converted_series = convert_series_to_degrees_if_requested(
data_class_series, angle_output_units
)
# an array of values is friendly (currently) for print_irradiance_table_2()
array = np.array([x.value for x in converted_series])
return array
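As a usage sketch, extracting a plain array of degree values from a series looks like the following (with a hypothetical minimal Angle container; the real data classes live elsewhere in the library):

```python
import numpy as np
from dataclasses import dataclass

# Hypothetical stand-in for the library's angle data classes
@dataclass
class Angle:
    value: float
    unit: str = "radians"

series = [Angle(0.0), Angle(np.pi / 2), Angle(np.pi)]
# Convert each value to degrees, then collect into a plain array,
# mirroring what the helper returns for table printing
degrees_array = np.array([np.degrees(a.value) for a in series])
```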
convert_series_to_degrees_if_requested ¶
convert_series_to_degrees_if_requested(
data_class_series: List | Any, angle_output_units: str
) -> List | Any
Vectorized conversion of a series of angle data from radians to degrees if requested.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
data_class_series | Union[List[Any], Any] | A single data class or a list of data classes containing a | required |
angle_output_units | str | The requested output unit ('degrees' or 'radians'). | required |
Returns:
| Type | Description |
|---|---|
Union[List[Any], Any] | Converted data class or list of converted data classes. |
Source code in pvgisprototype/api/utilities/conversions.py
def convert_series_to_degrees_if_requested(
data_class_series: List | Any,
angle_output_units: str,
) -> List | Any:
"""
Vectorized conversion of a series of angle data from radians to degrees if requested.
Parameters
----------
data_class_series : Union[List[Any], Any]
A single data class or a list of data classes containing a `value`
attribute for an _angular_ quantity and the `unit` attribute.
angle_output_units : str
The requested output unit ('degrees' or 'radians').
Returns
-------
Union[List[Any], Any]
Converted data class or list of converted data classes.
"""
if angle_output_units.lower() != DEGREES.lower():
return data_class_series
def convert_value_of(item):
if hasattr(item, "unit") and hasattr(item, "value"):
# Assuming unit is a string like 'radians' or 'degrees'
if item.unit.lower() != DEGREES.lower():
                # Special case due to the current nested design for HorizonHeight
                if hasattr(item, "horizon_height") and isinstance(
                    item.horizon_height, HorizonHeight
                ):
                    item.horizon_height.value = np.degrees(item.horizon_height.value)
                # ------------------------------------------- Redesign me -
if isinstance(item.value, np.ndarray):
# Numpy array: convert all values
item.value = np.degrees(item.value)
elif isinstance(item.value, (float, int)):
item.value = np.degrees(item.value)
# Optionally handle lists ?
elif isinstance(item.value, list):
item.value = [np.degrees(v) for v in item.value]
item.unit = DEGREES
return item
from copy import deepcopy
copy_of_data_class_series = deepcopy(data_class_series)
if isinstance(copy_of_data_class_series, list):
# Handle list input
return [convert_value_of(item) for item in copy_of_data_class_series]
else:
# Handle single instance input
return convert_value_of(copy_of_data_class_series)
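The unit-guard behaviour, converting only items not already in degrees, can be sketched as follows (with a hypothetical minimal angle container exposing the value and unit attributes these helpers rely on):

```python
import numpy as np
from dataclasses import dataclass

# Hypothetical stand-in for the library's angle data classes
@dataclass
class Angle:
    value: float
    unit: str

mixed = [Angle(np.pi, "radians"), Angle(45.0, "degrees")]
# Convert only the entries not already in degrees, as the helper does
converted = [
    Angle(np.degrees(a.value), "degrees") if a.unit != "degrees" else a
    for a in mixed
]
```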
convert_series_to_radians_if_requested ¶
convert_series_to_radians_if_requested(
data_class_series: List[Any], angle_output_units: str
) -> List[Any]
Vectorized conversion of a series of angle data from degrees to radians if requested.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
data_class_series | List[Any] | A list of data classes containing the angle value and unit. | required |
angle_output_units | str | The desired output unit ('degrees' or 'radians'). | required |
Returns:
| Type | Description |
|---|---|
List[Any] | A list of converted data classes. |
Source code in pvgisprototype/api/utilities/conversions.py
def convert_series_to_radians_if_requested(
    data_class_series: List[Any],
    angle_output_units: str,
) -> List[Any]:
    """
    Vectorized conversion of a series of angle data from degrees to radians if requested.
    Parameters
    ----------
    data_class_series : List[Any]
        A list of data classes containing the angle value and unit.
    angle_output_units : str
        The desired output unit ('degrees' or 'radians').
    Returns
    -------
    List[Any]
        A list of converted data classes.
    """
    from copy import deepcopy
    copy_of_data_class_series = deepcopy(data_class_series)
    if angle_output_units == RADIANS:
        values_to_convert = np.array(
            [
                data_class.value
                for data_class in copy_of_data_class_series
                if data_class.unit != RADIANS
            ]
        )
        converted_values = iter(np.radians(values_to_convert))
        for data_class in copy_of_data_class_series:
            if data_class.unit != RADIANS:
                # Consume the converted values in order; enumerating the full
                # series here would misalign indices whenever some items are
                # already in radians
                data_class.value = next(converted_values)
                data_class.unit = RADIANS
    return copy_of_data_class_series
convert_to_degrees ¶
Convert angle to degrees.
Source code in pvgisprototype/api/utilities/conversions.py
def convert_to_degrees(
ctx: typer.Context, param: typer.CallbackParam, angle: float
) -> float:
"""Convert angle to degrees."""
if ctx.resilient_parsing:
return
if not isinstance(angle, float):
raise typer.BadParameter(
            f"The input value {angle} for an angular measurement is not of the expected type float!"
)
return np.degrees(angle)
convert_to_degrees_if_requested ¶
Convert angle from radians to degrees if requested
Source code in pvgisprototype/api/utilities/conversions.py
def convert_to_degrees_if_requested(data_class: Any, output_units: str) -> Any:
"""Convert angle from radians to degrees if requested"""
from copy import deepcopy
copy_of_data_class = deepcopy(data_class)
if output_units == DEGREES and not data_class.unit == DEGREES:
copy_of_data_class.value = degrees(data_class.value)
copy_of_data_class.unit = DEGREES
return copy_of_data_class
convert_to_radians ¶
Convert floating point angular measurement from degrees to radians.
Source code in pvgisprototype/api/utilities/conversions.py
def convert_to_radians(
ctx: typer.Context, param: typer.CallbackParam, angle: float
) -> float:
"""Convert floating point angular measurement from degrees to radians."""
if ctx.resilient_parsing:
return
if not isinstance(angle, float):
raise typer.BadParameter("Input should be a float!")
return np.radians(angle)
convert_to_radians_fastapi ¶
convert_to_radians_if_requested ¶
Convert angle from degrees to radians for a single or an array of custom data structures if requested.
Source code in pvgisprototype/api/utilities/conversions.py
def convert_to_radians_if_requested(data_input: Any, output_units: str) -> Any:
"""Convert angle from degrees to radians for a single or an array of custom data structures if requested."""
if output_units != RADIANS:
return data_input
if isinstance(data_input, np.ndarray):
for data_class in data_input:
if data_class.unit != RADIANS:
data_class.value = radians(data_class.value)
data_class.unit = RADIANS
else:
if data_input.unit != RADIANS:
# data_class = replace(data_class, value=radians(data_class.value), unit='radians')
data_input.value = radians(data_input.value)
data_input.unit = RADIANS
return data_input
round_float_values ¶
Recursively round float attributes in a custom data class or any float.
Source code in pvgisprototype/api/utilities/conversions.py
def round_float_values(data, decimal_places=3):
"""Recursively round float attributes in a custom data class or any float."""
if isinstance(data, float):
return round(data, decimal_places)
if isinstance(data, np.floating):
return np.around(
data, decimals=decimal_places
) # See also Notes in numpy.round?
    if isinstance(data, np.ndarray) and data.dtype.kind in "if":
        return np.around(data, decimals=decimal_places)
if isinstance(data, dict):
return {
key: round_float_values(value, decimal_places)
for key, value in data.items()
if not isinstance(value, Enum)
}
if isinstance(data, list):
return [
round_float_values(item, decimal_places)
for item in data
if not isinstance(item, Enum)
]
if hasattr(data, "__dict__") and not isinstance(data, Enum):
for key, value in vars(data).items():
setattr(data, key, round_float_values(value, decimal_places))
return data
return data
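A simplified stand-in for the recursion on plain containers (the real helper additionally walks NumPy scalars, NumPy arrays, enums and custom data classes):

```python
# Minimal sketch of the recursive rounding on floats, dicts and lists
def round_floats(data, decimal_places=3):
    if isinstance(data, float):
        return round(data, decimal_places)
    if isinstance(data, dict):
        return {k: round_floats(v, decimal_places) for k, v in data.items()}
    if isinstance(data, list):
        return [round_floats(v, decimal_places) for v in data]
    return data  # leave non-float leaves untouched

result = round_floats({"angle": 1.23456789, "series": [0.000123456, 2.5]})
```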