This is the 12th day of my participation in the August More Text Challenge.
Keyboard input:
There are three basic operations for a keyboard key:
- Key pressed this frame: Input.GetKeyDown()
- Key released this frame: Input.GetKeyUp()
- Key held down (pressed and not yet released): Input.GetKey()
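A minimal sketch combining the three calls (assuming the script is attached to an object in the scene; the key chosen here is arbitrary):

```csharp
using UnityEngine;

public class KeyExample : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
            Debug.Log("Space was pressed this frame");
        if (Input.GetKey(KeyCode.Space))
            Debug.Log("Space is being held down");
        if (Input.GetKeyUp(KeyCode.Space))
            Debug.Log("Space was released this frame");
    }
}
```

Note that GetKeyDown and GetKeyUp fire only in the single frame where the state changes, while GetKey is true for every frame in between.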
- Horizontal, Vertical
Input.GetAxis returns the value of the virtual axis identified by the axis name. With keyboard and joystick input, this value ranges from -1 to 1.
For example:

```csharp
using UnityEngine;

public class Example : MonoBehaviour
{
    public float speed = 10.0f;
    public float rotationSpeed = 100.0f;

    void Update()
    {
        // Get the horizontal and vertical axes. By default they are mapped
        // to the arrow keys, and the value ranges from -1 to 1.
        float translation = Input.GetAxis("Vertical") * speed * Time.deltaTime;
        float rotation = Input.GetAxis("Horizontal") * rotationSpeed * Time.deltaTime;

        // Translate the object along its z axis
        transform.Translate(0, 0, translation);

        // Rotate around our y axis
        transform.Rotate(0, rotation, 0);
    }
}
```
Mouse input
- Axes: Mouse X (horizontal), Mouse Y (vertical), Mouse ScrollWheel
- Input.GetMouseButtonDown(): fires in the frame the button is pressed
- Input.GetMouseButtonUp(): fires in the frame the button is released
- Input.GetMouseButton(): fires every frame while the button is held down

The integer argument selects the button: 0 for the left button, 1 for the right button, 2 for the middle button. The mouse axes are read with Input.GetAxis / Input.GetAxisRaw using the axis names above.
```csharp
using UnityEngine;

public class Question1 : MonoBehaviour
{
    // While the left mouse button is held, use the mouse scroll wheel
    // to zoom the camera by changing its field of view.
    void Update()
    {
        if (Input.GetMouseButton(0)) // left mouse button held
        {
            // Scroll amount of the mouse wheel this frame
            float a = Input.GetAxis("Mouse ScrollWheel");

            // Adjust the fieldOfView property on the Camera component
            GetComponent<Camera>().fieldOfView += a;
        }
    }
}
```
Touch input
When a Unity game runs on an iOS or Android device, a single touch on the screen is automatically reported as the left mouse button, so desktop mouse code keeps working, but the mouse API has no way to express things like multi-touch. Unity's Input class therefore contains not only the input functions for desktop systems but also functions for touch operations on mobile devices. The following describes how to use the Input class for touch handling.
Start with Input.touches, an array of Touch structures in which each entry represents the state of one finger touching the screen.
Properties of a Touch:
- fingerId: the unique index of the touch
- position: the position of the touch on the screen
- deltaTime: the time elapsed since the last recorded change of the touch
- tapCount: the number of taps; Android does not count taps, so this property always returns 1 there
- deltaPosition: the change in screen position since the last frame
- phase: the state of the touch
Where phase (the touch state) has the following values:

Phase | Description |
---|---|
Began | A finger just touched the screen |
Moved | The finger moved across the screen |
Stationary | The finger is touching the screen but has not moved since the last frame |
Ended | The finger was lifted from the screen |
Canceled | The system cancelled tracking of the touch, for example because the device was held against the face or more than five touch points occurred at once |
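The properties and phases above can be combined into a simple per-finger loop; the sketch below (assuming the script runs on a touch-capable device) logs what each finger is doing every frame:

```csharp
using UnityEngine;

public class TouchExample : MonoBehaviour
{
    void Update()
    {
        // Iterate over every active finger this frame
        foreach (Touch touch in Input.touches)
        {
            switch (touch.phase)
            {
                case TouchPhase.Began:
                    Debug.Log($"Finger {touch.fingerId} down at {touch.position}");
                    break;
                case TouchPhase.Moved:
                    Debug.Log($"Finger {touch.fingerId} moved by {touch.deltaPosition}");
                    break;
                case TouchPhase.Ended:
                case TouchPhase.Canceled:
                    Debug.Log($"Finger {touch.fingerId} lifted");
                    break;
            }
        }
    }
}
```

Because Input.touches holds one entry per finger, this loop handles multi-touch naturally, which the mouse-emulation path cannot do.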