Category: Programming

Appium and Protractor for real device automated testing (Mac OS/iOS 10.3+)

Recently, a client requested that the Protractor integration tests for their Angular application be set up to run in a local testing environment against real device hardware. After going through the official ‘Getting Started’ guides for Appium, I felt they did not provide enough detail to get things running smoothly with Protractor and a web application. This guide should provide a comprehensive one-stop solution if you are having problems setting up Appium and Protractor with a web application. This article will focus only on setting up a local test environment on Mac OS using Appium, Protractor and an iPad Mini connected to the Mac with a USB cable.

Java, XCode and Homebrew Setup Guide

  1. The first step is to install the Java JDK from the official Oracle page; this only has to be done if you do not already have the Java JDK installed. Go to the Java JDK page and select the Java SE Development Kit for your operating system. For this article, I will be using Mac OS X. Once the binary has finished downloading, install the Java JDK.
  2. The next step is to install Xcode. Only follow this step if you do not have Xcode installed. The easiest way to install it is to go to the Mac App Store and search for ‘Xcode’.
  3. Once Xcode is installed, we need an optional component called the Xcode Command Line Tools. You can install these by opening up the terminal and typing xcode-select --install
  4. Next, install Homebrew by entering the following into the terminal:  ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
  5. Update our bash profile: open it with nano ~/.bash_profile, then enter export JAVA_HOME=$(/usr/libexec/java_home). Save the file and exit.
  6. At this point, it might be a good idea to close your current terminal window and fire up a new instance of one.

Install Node and Appium

  1. With Homebrew installed, we gain access to the helpful brew terminal command. Let’s update Homebrew by entering brew update.
  2. Next, install Node:  brew install node
  3. Installing Node also installs the Node Package Manager (npm). Let’s update npm: npm install -g npm. Note: If you receive an error like ‘npm command not found’, try closing your current terminal window and opening another.
  4. Once our initial dependencies are met, it’s time to install Appium. Enter npm install -g appium in the terminal window to install Appium globally.
  5. Next, install wd.js with npm install wd. wd is a library designed to be a malleable implementation of the WebDriver protocol in Node, exposing functionality via a number of programming paradigms (a short usage sketch follows after this list).
  6. Next we need Carthage, a decentralized dependency manager for Cocoa projects; Appium’s WebDriverAgent uses it to fetch its dependencies. Set it up as follows:
    1. cd /usr/local/lib/node_modules/appium/node_modules/appium-xcuitest-driver/WebDriverAgent
    2. brew install carthage
    3. npm i -g webpack
    4. ./Scripts/bootstrap.sh -d
  7. Appium has another dependency for real-device testing: libimobiledevice, a cross-platform software library that talks the protocols to support iPhone®, iPod Touch®, iPad® and Apple TV® devices. Install it by entering brew install libimobiledevice into the terminal.
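
Protractor will drive the WebDriver session for you, but to give a feel for wd itself, here is a minimal sketch that connects to a local Appium server and opens a page (the capabilities are illustrative):

    // A minimal wd sketch: connect to a local Appium server and load a page.
    const wd = require('wd');

    const driver = wd.promiseChainRemote('localhost', 4723);

    driver
      .init({ browserName: 'safari', platformName: 'iOS', deviceName: 'iPad Mini 2' })
      .get('https://example.com')
      .title()
      .then(title => console.log('Page title:', title))
      .fin(() => driver.quit())
      .done();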

Verify Prerequisites are met with Appium Doctor

  1. appium-doctor is a handy utility which verifies you have all of the necessary dependencies to properly run Appium against the environment you specify. Enter npm install -g appium-doctor to install it.
  2. Run Appium Doctor with appium-doctor --ios
  3. Examine the output. If Appium Doctor complains that you are missing something, verify you didn’t miss a step.

Install iOS Deploy and iOS Debug Proxy

  1. At this point, we have all of the dependencies to run Appium. However, for automated testing on an iPad we need additional dependencies in order to properly run our Protractor integration tests. The first of these is ios-deploy. Install it through npm: npm install -g ios-deploy
  2. Finally, we need the iOS WebKit Debug Proxy (iwdp), which proxies requests from the usbmuxd daemon over a websocket connection, allowing developers to send commands to MobileSafari and UIWebViews on real and simulated iOS devices. Install it via brew install ios-webkit-debug-proxy

Setting up Protractor Capabilities

  1. For real (local) device testing, you need a valid xcodeOrgId, xcodeSigningId and udid in your Protractor capabilities:
    1. xcodeOrgId: your Apple Team ID
    2. xcodeSigningId: ‘iPhone Developer’
    3. autoWebview: true
    4. udid: ‘auto’
  2. The Apple Team ID can be found on the Apple Developer page. Do not change the xcodeSigningId value.
  3. The udid is the unique ID of the real iOS device connected to your Mac. Set the value to ‘auto’ for automatic detection.

Putting it all together

  1. Once everything is installed, let’s start the Appium server and the iOS WebKit Debug Proxy. First, Appium can be started by entering appium in the terminal. Give it some time to boot up.
  2. Launch the iOS Debug Proxy  ios_webkit_debug_proxy -c null:27753,:27753-27853
  3. Run your Protractor tests:  protractor conf.ipad-dev.js --baseUrl="http://myUrl.com" --specs="path/to/my/test"
  4. For completeness, here are my Protractor capabilities:
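
A representative conf.ipad-dev.js assembled from the values described above; deviceName, platformVersion and the Team ID placeholder are illustrative and should match your own device and account:

    // conf.ipad-dev.js -- a representative sketch, not the exact original file.
    exports.config = {
      seleniumAddress: 'http://localhost:4723/wd/hub', // the local Appium server
      capabilities: {
        browserName: 'safari',
        platformName: 'iOS',
        platformVersion: '10.3',        // illustrative: match your device
        deviceName: 'iPad Mini 2',      // illustrative: match your device
        automationName: 'XCUITest',
        xcodeOrgId: 'YOUR_APPLE_TEAM_ID',
        xcodeSigningId: 'iPhone Developer',
        autoWebview: true,
        udid: 'auto'
      }
    };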

 

Note: If you get ‘cannot find webview, retrying.’ errors ad infinitum, do the following:

brew reinstall --HEAD libimobiledevice

brew reinstall -s ios-webkit-debug-proxy

Conclusion

That’s it! Your Protractor tests should now be running on a real device. I ran into a lot of trouble getting Appium set up for real devices, and every source I found on the web only had pieces of the final solution. This article was written with the fabulous help of many people and I hope it can provide some measure of assistance if you are experiencing the same issues I was when trying to get Appium running with Protractor and a real iOS 10.3+ device (iPad Mini 2).

Weekly Content Blog #33: What’s new?

Wow, hard to believe we’re already at Week 33! Initially, this development blog was meant to record all of the week-to-week changes as production continued, and note any new additions (or subtractions) to the core content and features of the game as time went on. Instead, it evolved into more than that. We delved into the musical influences of past composers who shaped the sound of Shadows of Adam, we ran a three-part series about budgeting the art costs of an RPG, we took a look at some of the retro flaws afflicting the classics, we wrote level design articles and even some boring technical articles. Looking through our post history, I noticed we don’t have many articles where we talk about what we actually worked on between posts, so I am dedicating this week’s development log to a changelog of what has changed since my last post on September 22, 2015, and quite a lot has changed!

New Menu Skin

[Image: the old item menu]

[Images: menu-item-new, menu-new, menu-save-1, menu-save-2]

We wanted the ability to change window skins (a la Final Fantasy window colors), and unfortunately the old menu skins were simply a single image texture with text aligned to match the image. This doesn’t give us any flexibility at all, and makes it impossible to add movement animations to individual windows since it’s all one big image! Well, that has all been fixed! We introduced Scale9 (nine-slice) image support and completely re-worked the UI so we can add multiple different window colors, and offer them as an option should the player wish it (a small sketch of the nine-slice idea follows below). I think it looks nicer, and it’s easier to work with as well! Still some alignment issues to fix, but nothing major 🙂
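
For the curious, the core of the nine-slice idea looks something like this; a minimal sketch using plain canvas calls, with illustrative names rather than our actual engine code:

    // Draw a skin image as a nine-slice: corners stay fixed, edges stretch on
    // one axis, the centre stretches on both. `b` is the corner size in pixels.
    function drawScale9(ctx, img, x, y, w, h, b) {
      const iw = img.width, ih = img.height;
      const sx = [0, b, iw - b], sy = [0, b, ih - b];          // source positions
      const sw = [b, iw - 2 * b, b], sh = [b, ih - 2 * b, b];  // source sizes
      const dx = [x, x + b, x + w - b], dy = [y, y + b, y + h - b];
      const dw = [b, w - 2 * b, b], dh = [b, h - 2 * b, b];
      for (let row = 0; row < 3; row++) {
        for (let col = 0; col < 3; col++) {
          ctx.drawImage(img,
            sx[col], sy[row], sw[col], sh[row],   // slice of the skin texture
            dx[col], dy[row], dw[col], dh[row]);  // where it lands on screen
        }
      }
    }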

Area Text Enhancement

[GIF: area intro]

(Image is a GIF) You know in -some- classic RPGs when you enter a new area, a helpful box appears telling you the area name? We always had that simple feature in, but the area box had to be called using our scene system before it would appear. As such, it was only being used during certain puzzles. We improved this functionality so the area intro actually acts as an area intro, but only fires off once. You want to always be told which area you’re in, and don’t want to talk to the nearest town guard? I’m sure I can make an option for that.

Named (string) IDs instead of numerical IDs

I still find it hard to believe this ‘developer sanity’ feature wasn’t pushed more heavily by the team. Basically, we were using numerical IDs instead of named IDs for EVERYTHING. You want to add a treasure chest to a map? You have to go into the item data files, find the name of the item or piece of equipment you want to give the player, collect its ID and then attach that number to the chest object. Oh, what’s that you say? You can collect the item continuously and the chest never goes into its ‘Opened’ state? Wellll, for that you have to go into our ‘Treasure IDs’ worksheet on Google Sheets, assign a ‘treasure switch ID’ to your chest in the sheet (to prevent someone from using the same ID) and then assign that number to the chest. BOOM! Your treasure chest is now working! With named IDs, you just reference the equipment by its name, give the chest an opened-state name and BOOM, done. No more going back and forth spending coveted development time searching for the ID of the “Basic Armor”, or navigating development folders trying to find the treasure chest switch ID list. Now, it’s all automatic. A sketch of the difference follows below.
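
A hypothetical before-and-after of the chest data described above (illustrative only, not our actual data format):

    // Before: numeric IDs that mean nothing without a lookup sheet.
    const chestBefore = { itemId: 42, treasureSwitchId: 113 };

    // After: named IDs that read like what they are.
    const chestAfter = { item: 'Basic Armor', openedState: 'basic_armor_chest_opened' };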

Why did we never do this sooner? We’re stubborn here in SC land 😉

Quest Flow Engine

[Images: quest-flow-1, quest-flow-2, quest-flow-3, quest-flow-4]

I can’t believe it took us this long to get this feature in. Our de-facto way of handling quest flow in the game is boolean switches. Nothing wrong with that, right? Well, our switches (and variables) suffered from the same problem mentioned above: it was all raw numbers, not stored in a human-readable way like an actual WORD. You meet a witch in the woods? She turns switch 113 to 1. If switch 113 is 1, then a new monster spawns and turns switch 127 to 1 and 114 to 0. These numbers mean nothing to me (and I’m one of the developers!). Now, with our named IDs, it is very easy to manage our quest flow. Our ‘Quest Flow Engine’ consists of two parts (a sketch follows the list below):

  • 1.) The Controller: a special object that can be activated by any other game object. All it does is control the state of the game. If your quest flow looks like ‘talk to Bob’ THEN ‘innkeeper lets you into the secret room’, then once the player talks to Bob, the ‘Bob’ game object tells the controller: “Hey, the player talked to me! Set the ‘Talked_To_Bob’ state to true!”
  • 2.) The Trigger: made up of two parts, the condition and the logic. A single trigger can have any number of conditions or logic associated with it. If the Controller has power over the game state, the trigger has power over the properties and states of each game object. So, using the prior example, the trigger could be waiting for the ‘Talked_To_Bob’ condition to be met, and once met, it fires off some logic which updates the ‘Innkeeper’ NPC to allow the player entry into the back rooms.
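
Here’s a tiny sketch of the controller/trigger split; all names are illustrative, and this is not the actual SoA engine code:

    const gameState = {};
    const innkeeper = { allowEntry: false };

    const triggers = [{
      // The condition: what the trigger waits for.
      check() {
        if (gameState['Talked_To_Bob']) this.fire();
      },
      // The logic: what happens once the condition is met.
      fire() {
        innkeeper.allowEntry = true;
      }
    }];

    const controller = {
      // Any game object can call this to change the game state;
      // triggers are re-evaluated on every change.
      set(flag, value) {
        gameState[flag] = value;
        triggers.forEach(t => t.check());
      }
    };

    // The 'Bob' game object reports in when the player talks to him:
    controller.set('Talked_To_Bob', true);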

Joypad Support + Control key rebinding

Joypad support has been in for a while, but we never really talked about it. Shadows of Adam supports any gamepad that uses the XInput API. The standard gamepad using this API is the Xbox 360 controller. If you have an older gamepad that uses DirectInput instead of XInput, we don’t support it natively, but you -may- be able to use additional software to convert the DirectInput signals into XInput signals. Further, we’ve also added key re-binding. This is a must-have feature in my opinion. There is no way every single player of your game will enjoy the default key bindings, so having the option to re-bind goes a long way!

Skill Leveling

Each character has seven core skills, with the exception of Asrael, who has eight. We wanted to avoid the level-up mechanic found in classic jRPGs where a party member learns a more powerful version of an already-learned skill (Fire 1, Fire 2, Fire 3, etc.), so that by the end of the game the skill list is populated with a large number of identical skills of varying strength. Instead, we decided to use a character MP pool, with each skill costing a certain percentage of the character’s available pool, and a base percentage recovered each round. Each skill becomes more powerful as a character gains higher attribute points, while the cost of casting a skill remains mostly fixed. I say ‘mostly’ because we wanted the player to be able to customize builds as each hero becomes more powerful. So we took a page from Super Mario RPG: every time a hero levels up, the player can choose a single level-up bonus. These bonuses can be a stat boost, a reduction in the cost to cast a skill, and once in a while, a Skill Level-up, which changes the functionality of a single skill to make it much more effective and useful. Every skill can be upgraded three times, but the final “Ultimate” version of the skill may require more than just an option select 😉

That’s it for this week, I hope you enjoyed this post :). If you have any suggestions on what you would like to see us write about, let us know either in the comments below or on one of our social media accounts, and we’ll see what we can do 😀

See you next week,

Tyler

Weekly Content Blog #23: Post-processing

So after going over your level design budget trying to figure out how those darn attractive tiles connect to each other, you finally figure it out and have a level. Congratulations! Your game is complete and now you can release it and retire to a Nepalese Monk Monastery. Wait, what do you mean the level is bare and has none of that extra juice?

[Image: magma sanctum]

You let out a loud groan of despair and shake your fist at the user who ruined your retirement plans. As you return to the lowly confines of your office, you plan how you’re going to make the level built by Luke the Wondrous the most amazing level yet seen in the game. You’re young, idealistic and invincible. What can you add that suits your greatness? Which type of effect can be found littered throughout every triple-A game? That’s right: post-processing effects.

[Image: scene blur]

Nothing can bring you down now, for you possess the power of post-processing effects! Now let’s see them say there is nothing to do! Now let’s see if they can find all the secret effects littered throughout the level to unlock the secret super badazz effect boss that can only be beaten with the power of love.

Shadows of Adam uses two methods to handle post-processing effects. The first, and main, method uses WebGL to render advanced effects in real-time that require full image awareness. Our second, fallback/OMGNOWEBGL, method uses native canvas API capabilities to either fake the desired result through hackery, or imitate the effect exactly but with a noticeable performance hit. Our planned minimum hardware specification for release is an AMD E-350 CPU with an AMD Radeon HD 6310 video card; this is pretty low-end hardware, primarily found in cheap notebooks and media/playback machines, so hopefully the vast majority of players will have no problem playing the game with all of the post-processing effects firing at their highest quality setting.

I initially had a lengthy article planned about how these effects work in our game, but this post is already a few hours behind schedule and I’ve gone and borked the entire game:
[Image: borked]

and I may be using a tiny bit too much VRAM
[Image: le-sigh]

So I’ll leave a bunch of images at the bottom, along with a GIF, and use the time-tested excuse of “It’s a feature” and see myself out…

[Images: lens blur, vibrance, sharpness adjustment]

Okay fine, I’ll fix it. Only because you, the reader, are awesome and deserve a product that works.

Weekly Content Blog #18: Lights, PFX and Concepts

Hello everyone!

Today I want to talk about some of the tasks I have been working on since my last blog post on June 23 (https://www.somethingclassic.net/weekly-content-blog-15-not-invented-here/). Last time, I was working on a particle effect implementation for ‘Shadows of Adam’ so we can quickly and cheaply create new visual effects to add more depth and detail to our game world, and that extra spit and polish to some of our game systems. Since then, I have primarily been focused on finding an adequate lighting solution that blends well with our art style (read: not so detailed that the lighting sticks out from our art, but not so ‘pixely’ that it’s just a simple shape), and that’s been quite a challenge. We’re still working on it as a team to this day, but I’ll share some concepts of methods we have tried so far. After that, I’m going to talk more about particle effects and discuss in detail how they work in SoA, and what we plan on using them for. First, let’s share some lighting tests we ran:

[Image: raycaster lights] The first concept test we ran was a raycaster-based lighting system. The test is actually an Impact plugin created by Marc Henklein, with minor modifications. The approach here is to shoot multiple rays from a central position into the game world to create a circle (or semi-circle, depending on our angle settings), then fill in the circular shape with a specific color. Next, we take this shape and draw it over our dark mask, essentially punching a see-through hole in our otherwise blinding curtain. Finally, we draw the resulting image over our game layers to provide the Light/Shadow effect (a toy version appears below). Simple, and it works well. However, the lights don’t look very good in that screenshot. They’re passable, but we need to do a few more passes and have the lights blend better with the game world so they look more realistic and less ridiculous.
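
A toy version of the raycast approach, assuming an isWall(x, y) collision query; all names are illustrative, not the plugin’s code:

    // March rays outward from the origin until they hit a wall or max range,
    // and collect the hit points into a light polygon.
    function castLightPolygon(origin, radius, rayCount, isWall) {
      const points = [];
      for (let i = 0; i < rayCount; i++) {
        const angle = (i / rayCount) * Math.PI * 2;
        let d = 0;
        // Step along the ray until it is blocked or out of range.
        while (d < radius &&
               !isWall(origin.x + Math.cos(angle) * d,
                       origin.y + Math.sin(angle) * d)) {
          d += 2; // step size in pixels
        }
        points.push({
          x: origin.x + Math.cos(angle) * d,
          y: origin.y + Math.sin(angle) * d
        });
      }
      // Fill this polygon into the dark mask to punch the see-through hole.
      return points;
    }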

[Image: gradient lights] The next lighting method we tried is more demanding graphically, but yields much better results (as you’d expect, eh? 😉). Basically, we create a radial gradient in an off-screen canvas element, draw the gradient image onto our game world wherever a ‘Light’ entity exists, and finally draw a dark mask over all of our game layers to finish the effect (a sketch of this idea follows below). The end result is what appears in the screenshot to the right, and the shots below. A very quick implementation that is decently fast and allows for many lights in a scene, but the cost of using this method is that we lose fine-grained control over how our light should look, and the resulting lights look slightly out of place next to our art. We have to find the perfect balance so we can show off dynamic lighting effects that still blend seamlessly with our environments. It’s a tough problem and one we are still working on!
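
One way to realize the effect described above with plain canvas calls, punching gradient lights out of the dark mask with ‘destination-out’ (illustrative names, not our engine code):

    // Fill an off-screen mask canvas with darkness, then erase a soft radial
    // hole wherever a Light entity lives; the mask is later drawn over the
    // game layers.
    function drawDarkMaskWithLights(maskCtx, lights, w, h) {
      maskCtx.save();
      maskCtx.clearRect(0, 0, w, h);
      maskCtx.globalCompositeOperation = 'source-over';
      maskCtx.fillStyle = 'rgba(0, 0, 0, 0.85)';
      maskCtx.fillRect(0, 0, w, h);
      maskCtx.globalCompositeOperation = 'destination-out';
      for (const light of lights) {
        const g = maskCtx.createRadialGradient(
          light.x, light.y, 0,
          light.x, light.y, light.radius
        );
        g.addColorStop(0, 'rgba(0, 0, 0, 1)'); // fully erased at the centre
        g.addColorStop(1, 'rgba(0, 0, 0, 0)'); // fades back into darkness
        maskCtx.fillStyle = g;
        maskCtx.fillRect(light.x - light.radius, light.y - light.radius,
                         light.radius * 2, light.radius * 2);
      }
      maskCtx.restore();
    }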

[Images: lights-2, lights-3, lights-4]

I spent the entire last blog post talking about particle effects (‘PFX’), so I won’t rehash too much of what was covered before; I don’t want to sound like a broken record. Instead, today I’m going to talk briefly about the technical details of what makes our PFX system tick.

[Image: PFX textures] Our particle system was built to be data-driven, with every particle effect described by a JSON particle file. All we need to do to add a particle effect to the game world is 1.) position an ‘Emitter’ entity and 2.) give the ‘Emitter’ entity the name of the particle effect to use. From there, all settings are controlled directly from the particle file. The structure looks something like this:
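
(A hypothetical example; the field names are illustrative, not the actual SoA format.)

    {
      "name": "chimney-smoke",
      "texture": "media/pfx/smoke.png",
      "emissionRate": 8,
      "lifetime": [1.5, 3.0],
      "velocity": { "x": [-5, 5], "y": [-30, -15] },
      "alpha": { "start": 0.8, "end": 0.0 },
      "scale": { "start": 1.0, "end": 2.5 }
    }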

Once a particle file is loaded into the emitter, we next create an off-screen canvas element. This off-screen canvas acts as our renderer; instead of drawing particles directly to the visible game world, we first render all of our particle effects off-screen in memory, and then draw only the resulting image into the game world (a sketch follows below). In the picture to the right, the blue square represents the border of our render space, and the red border marks the boundaries of the emitter itself. By rendering our particles this way, we give ourselves extra flexibility in how we control both our emitters and our particles. Since our Emitter is an entity, we can control it like any other interactive object in the game: tween it, animate it and position it however we want. Further, since our particle render target is its own separate image, we can animate and tween it independently of our emitter. Below, you’ll find a real-world example of our PFX in action. In this screenshot, we are using it to create smoke from the chimneys in the town of “Adam”.
[Image: chimney smoke PFX]
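
In code, the off-screen pass boils down to something like this (a sketch with illustrative names, not the actual engine code):

    const buffer = document.createElement('canvas'); // off-screen render target
    buffer.width = 128;
    buffer.height = 128;
    const bctx = buffer.getContext('2d');

    function renderParticles(gameCtx, particles, emitter) {
      bctx.clearRect(0, 0, buffer.width, buffer.height);
      // Draw every live particle into the off-screen buffer first...
      for (const p of particles) {
        bctx.globalAlpha = p.alpha;
        bctx.drawImage(p.texture, p.x, p.y);
      }
      bctx.globalAlpha = 1;
      // ...then stamp the finished image into the game world once. Because
      // the buffer is a single image, it can be tweened, scaled or faded
      // independently of the emitter entity.
      gameCtx.drawImage(buffer, emitter.screenX, emitter.screenY);
    }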
If anyone has any questions, or has an opinion on some of the images in this article, please let us know what you think in either the ‘Comments’ section or via the ‘Contact us’ option at the top of the page. We’d love to hear your opinions!

To close, I’d like to show off a sneak peek of some of the new UI concept work Tim has been working on. It’s his turn to post next week, so a little teaser should be good 🙂
[Image: new battle UI concept]

Weekly Content Blog #15: “Not Invented here”

Particle effects play a huge role in modern video games. Not only do they look fabulous and really add that extra juice to a scene, they are trivial to build… at least on paper. The tricky part about a particle engine is optimizing the creation and destruction of the particle objects, in addition to making those particle objects render fast within your rendering pipeline. The requirements Tim had for a particle engine were pretty standard (basically, everything the Starling game engine could do; see an example here: http://onebyonedesign.com/flash/particleeditor/). I am also going to try presenting this post mostly in images (and GIFs) to show the progress from the beginning to the final product, with some short spiel, so beware! Lots of images and GIFs!

[Image: step 1] The first step was to flesh out our emitter and particle classes. It obviously doesn’t look like much, and chances are the thumbnail appears only as a black box (click the thumbnail to enlarge it). The performance at this stage was incredibly poor; we achieved a meager 300 particles per 3.00ms. If we want to hit a frame rate of 60 FPS, then we need to cram collision detection, drawing, particles, scene composition, pathfinding and entity processing into sixteen milliseconds. Having to allocate 1.00ms per 100 particles of our 16ms budget isn’t good enough. So the next step is to add an object pool. If you’re not familiar with object pooling, what we do is create a large ‘pool’ of objects and then recycle those objects for our particles. Whenever we draw a new particle to the screen, we request a blank particle object from our pre-initialized pool, set the desired attributes and push it to our draw list. The advantage is that we do not have to construct a new Particle instance every time a particle is created; constantly creating and destroying the same thing is both expensive and wasteful. Once a particle is destroyed, instead of marking it for memory clean-up, we keep a reference to it and add it back into our object pool (a bare-bones sketch follows below).
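
A bare-bones pool in the spirit described above (illustrative, not the actual engine code):

    class Particle {
      constructor() { this.reset(); }
      reset() { this.x = 0; this.y = 0; this.alpha = 1; this.alive = false; }
    }

    class ParticlePool {
      constructor(size) {
        this.free = [];
        for (let i = 0; i < size; i++) this.free.push(new Particle());
      }
      // Hand out a recycled particle instead of allocating a fresh one.
      acquire() {
        return this.free.length > 0 ? this.free.pop() : new Particle();
      }
      // On 'destruction', keep the reference and return it to the pool.
      release(particle) {
        particle.reset();
        this.free.push(particle);
      }
    }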

[Image: step 3]
Another bland-looking thumbnail image. It will get better, I promise! With our object pool in place, our particle engine now performs at 1,000 particles per 3.00ms. Compared to some powerful particle engines that can push out millions of particles with ease, ours is terrible. But 1,000 per 3ms is good enough for our needs; further optimization can be done down the line when we get closer to release. The next step is to support multiple particle textures, so we can have a special fire texture that turns into a special smoke texture when the particle is destroyed. So let’s do that now…

(Click the image to view the GIF.) Now we can support multiple textures. But I look at what is finished compared to the requested feature list and I groan. There is still lots of work to go, and we want to get this game finished. So what options do I have? Instead of maintaining a “not-invented-here” mentality, let’s open Google and see what I can find… Well, hello, Proton.

[Image: step 2]

After discovering Proton and diving into the examples and some of the source, I asked myself an interesting question: “This does almost everything we need, is fast, looks fantastic and is easy to implement. Why am I re-inventing the wheel?” Less than thirty minutes after settling on Proton, I had it integrated with our game engine and had some simple but cool-looking effects being rendered.

[Image: step 4]

I greatly respect developers who build their own engines from the ground up; it is a monstrous undertaking but chock-full of rewards. But I’m of the opinion that if you’re not a successful developer with a large runway of cash saved up, you should drop the idea of building your own engine and use middleware; unless, of course, your goal is to create an awesome engine. If your goal is to develop a game, pick an engine and create that game. There are no “best engines”: pick one and use it. Work around its flaws and keep pushing forward.

Trying to build everything yourself will only add to your development time, and although it is greatly rewarding if you can pull it off, the time spent developing all of these engines and components can push your small project into an area where it becomes significantly harder to break even, let alone make a profit. If you want to develop for the long term, you have to find a balance. Of course, we all have different goals. Some people are not motivated by economic rewards and are happy knowing that they completed something of substance. But if you are starting out as a small developer, drop the “not-invented-here” mentality and instead focus on creating the game you want to make. Don’t waste time picking the “best and greatest” engine; choose one that you understand and that clicks with you, and work around its flaws; even the biggest game engines have their quirks.

[Image: step 5]