using Content.Shared.Chat.Prototypes;
using Robust.Shared.Audio;
using Robust.Shared.GameStates;
using Robust.Shared.Prototypes;

namespace Content.Shared.Speech
{
    /// <summary>
    /// Component required for entities to be able to speak. (TODO: entities can speak fine without this;
    /// it only forbids speech when the component is present and <see cref="Enabled"/> is false.)
    /// Contains the option to let entities make noise when speaking, change speech verbs, datafields for
    /// the sounds in question, and the relevant <see cref="Robust.Shared.Audio.AudioParams"/>.
    /// </summary>
    [RegisterComponent, NetworkedComponent, AutoGenerateComponentState]
    public sealed partial class SpeechComponent : Component
    {
        /// <summary>
        /// Whether this entity is currently allowed to speak.
        /// </summary>
        [DataField, AutoNetworkedField]
        [Access(typeof(SpeechSystem), Friend = AccessPermissions.ReadWrite, Other = AccessPermissions.Read)]
        public bool Enabled = true;

        /// <summary>
        /// The sound collection played when this entity speaks, if any.
        /// </summary>
        [ViewVariables(VVAccess.ReadWrite)]
        [DataField]
        public ProtoId<SpeechSoundsPrototype>? SpeechSounds;

        /// <summary>
        /// What speech verb prototype should be used by default for displaying this entity's messages?
        /// </summary>
        [ViewVariables(VVAccess.ReadWrite)]
        [DataField]
        public ProtoId<SpeechVerbPrototype> SpeechVerb = "Default";

        /// <summary>
        /// Emotes this entity is allowed to use even if the emote's <see cref="EmotePrototype.Available"/> is false.
        /// </summary>
        [ViewVariables(VVAccess.ReadWrite)]
        [DataField]
        public List<ProtoId<EmotePrototype>> AllowedEmotes = new();

        /// <summary>
        /// A mapping from chat suffix loc strings to speech verb prototypes that should be conditionally used.
        /// For things like '?' changing to 'asks' or '!!' making text bold and changing to 'yells'. Can be overridden if necessary.
        /// </summary>
        [DataField]
        public Dictionary<string, ProtoId<SpeechVerbPrototype>> SuffixSpeechVerbs = new()
        {
            { "chat-speech-verb-suffix-exclamation-strong", "DefaultExclamationStrong" },
            { "chat-speech-verb-suffix-exclamation", "DefaultExclamation" },
            { "chat-speech-verb-suffix-question", "DefaultQuestion" },
            { "chat-speech-verb-suffix-stutter", "DefaultStutter" },
            { "chat-speech-verb-suffix-mumble", "DefaultMumble" },
        };

        /// <summary>
        /// Audio parameters applied to speech sounds played by this entity.
        /// </summary>
        [DataField]
        public AudioParams AudioParams = AudioParams.Default.WithVolume(-2f).WithRolloffFactor(4.5f);

        /// <summary>
        /// Minimum time, in seconds, between consecutive speech sounds from this entity.
        /// </summary>
        [ViewVariables(VVAccess.ReadWrite)]
        [DataField]
        public float SoundCooldownTime { get; set; } = 0.5f;

        /// <summary>
        /// The game time at which this entity last played a speech sound; used with <see cref="SoundCooldownTime"/>.
        /// </summary>
        public TimeSpan LastTimeSoundPlayed = TimeSpan.Zero;

        /// <summary>
        /// Additional vertical offset for speech bubbles generated by this entity.
        /// </summary>
        [DataField]
        public float SpeechBubbleOffset = 0f;
    }
}