
How can I create a UIColor from a hex string?

How can I create a UIColor from a hexadecimal string format, such as #00FF00?


Top-Master

I've found the simplest way to do this is with a macro. Just include it in your header and it's available throughout your project.

#define UIColorFromRGB(rgbValue) [UIColor colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 green:((float)((rgbValue & 0xFF00) >> 8))/255.0 blue:((float)(rgbValue & 0xFF))/255.0 alpha:1.0]


Here is a formatted version of the same macro:

#define UIColorFromRGB(rgbValue) \
[UIColor colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 \
                green:((float)((rgbValue & 0x00FF00) >>  8))/255.0 \
                 blue:((float)((rgbValue & 0x0000FF) >>  0))/255.0 \
                alpha:1.0]

Usage:

label.textColor = UIColorFromRGB(0xBC1128);

Swift:

static func UIColorFromRGB(_ rgbValue: Int) -> UIColor! {
    return UIColor(
        red: CGFloat((Float((rgbValue & 0xff0000) >> 16)) / 255.0),
        green: CGFloat((Float((rgbValue & 0x00ff00) >> 8)) / 255.0),
        blue: CGFloat((Float((rgbValue & 0x0000ff) >> 0)) / 255.0),
        alpha: 1.0)
}

This is great except it doesn't do what the questioner asks, which is to convert a hex STRING into a UIColor. This converts an integer to a UIColor.
@MohamedA.Karim That is an example of returning a UIColor from a hex format integer (0x...) not a hex format string ("#..."). Great if that's what you want, but not what the questioner asked for.
@ScottKohlert Your line of code converts one hex format string (prefixed with "#") into another hex format string (prefixed with "0x"). It does not produce an integer.
How to use this like [UIColor whiteColor].CGColor?
To convert a hex format string to an integer for use with this macro, see stackoverflow.com/questions/3648411/….
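
For illustration, a minimal Swift sketch of that string-to-integer step (the rgbInt(from:) helper name is made up here, not taken from the linked question):

// Strips an optional leading "#" and parses the remaining hex digits into an Int,
// which can then be fed to the UIColorFromRGB helpers above.
func rgbInt(from hexString: String) -> Int {
    let digits = hexString.hasPrefix("#") ? String(hexString.dropFirst()) : hexString
    return Int(digits, radix: 16) ?? 0
}

let value = rgbInt(from: "#BC1128")      // value == 0xBC1128
// label.textColor = UIColorFromRGB(value)   // using the Swift helper above
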
Top-Master

A concise solution:

// Assumes input like "#00FF00" (#RRGGBB).
+ (UIColor *)colorFromHexString:(NSString *)hexString {
    unsigned rgbValue = 0;
    NSScanner *scanner = [NSScanner scannerWithString:hexString];
    [scanner setScanLocation:1]; // bypass '#' character
    [scanner scanHexInt:&rgbValue];
    return [UIColor colorWithRed:((rgbValue & 0xFF0000) >> 16)/255.0 green:((rgbValue & 0xFF00) >> 8)/255.0 blue:(rgbValue & 0xFF)/255.0 alpha:1.0];
}

And a good method for doing the reverse conversion (like if you're storing colors in core data / a remote database) can be found here - stackoverflow.com/questions/11884227/…
A perfect solution. If your hex string comes from a (very poorly documented) API, be sure to test against shorthand hex codes like #FFF or #FC0. You'll need to expand them to #FFFFFF/#FFCC00.
You might also want to add if ( [hexString rangeOfString:@"#"].location == 0 ) before the setScanLocation line to make the # optional.
Can you please add the reverse method?
For the lazy: SWIFT version here.
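
Since the comments above ask about the reverse conversion (UIColor back to a hex string), here is a rough Swift sketch; the hexString property name is illustrative and it assumes the color can be read back as RGB:

import UIKit

extension UIColor {
    // Reverse conversion: UIColor -> "#RRGGBB". Returns nil if the color
    // cannot be converted to an RGB-compatible color space.
    var hexString: String? {
        var red: CGFloat = 0, green: CGFloat = 0, blue: CGFloat = 0, alpha: CGFloat = 0
        guard getRed(&red, green: &green, blue: &blue, alpha: &alpha) else { return nil }
        return String(format: "#%02X%02X%02X",
                      Int((red * 255).rounded()),
                      Int((green * 255).rounded()),
                      Int((blue * 255).rounded()))
    }
}

// UIColor.red.hexString == "#FF0000"
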
Top-Master

I've got a solution that is 100% compatible with the hex format strings used by Android, which I found very helpful when doing cross-platform mobile development. It lets me use one color palette for both platforms. Feel free to reuse without attribution, or under the Apache license if you prefer.

#import "UIColor+HexString.h"

@interface UIColor(HexString)

+ (UIColor *) colorWithHexString: (NSString *) hexString;
+ (CGFloat) colorComponentFrom: (NSString *) string start: (NSUInteger) start length: (NSUInteger) length;

@end


@implementation UIColor(HexString)

+ (UIColor *) colorWithHexString: (NSString *) hexString {
    NSString *colorString = [[hexString stringByReplacingOccurrencesOfString: @"#" withString: @""] uppercaseString];
    CGFloat alpha, red, blue, green;
    switch ([colorString length]) {
        case 3: // #RGB
            alpha = 1.0f;
            red   = [self colorComponentFrom: colorString start: 0 length: 1];
            green = [self colorComponentFrom: colorString start: 1 length: 1];
            blue  = [self colorComponentFrom: colorString start: 2 length: 1];
            break;
        case 4: // #ARGB
            alpha = [self colorComponentFrom: colorString start: 0 length: 1];
            red   = [self colorComponentFrom: colorString start: 1 length: 1];
            green = [self colorComponentFrom: colorString start: 2 length: 1];
            blue  = [self colorComponentFrom: colorString start: 3 length: 1];          
            break;
        case 6: // #RRGGBB
            alpha = 1.0f;
            red   = [self colorComponentFrom: colorString start: 0 length: 2];
            green = [self colorComponentFrom: colorString start: 2 length: 2];
            blue  = [self colorComponentFrom: colorString start: 4 length: 2];                      
            break;
        case 8: // #AARRGGBB
            alpha = [self colorComponentFrom: colorString start: 0 length: 2];
            red   = [self colorComponentFrom: colorString start: 2 length: 2];
            green = [self colorComponentFrom: colorString start: 4 length: 2];
            blue  = [self colorComponentFrom: colorString start: 6 length: 2];                      
            break;
        default:
            [NSException raise:@"Invalid color value" format: @"Color value %@ is invalid. It should be a hex value of the form #RGB, #ARGB, #RRGGBB, or #AARRGGBB", hexString];
            break;
    }
    return [UIColor colorWithRed: red green: green blue: blue alpha: alpha];
}

+ (CGFloat) colorComponentFrom: (NSString *) string start: (NSUInteger) start length: (NSUInteger) length {
    NSString *substring = [string substringWithRange: NSMakeRange(start, length)];
    NSString *fullHex = length == 2 ? substring : [NSString stringWithFormat: @"%@%@", substring, substring];
    unsigned hexComponent;
    [[NSScanner scannerWithString: fullHex] scanHexInt: &hexComponent];
    return hexComponent / 255.0;
}

@end 

Swift:

extension UIColor {
    convenience init?(hexString: String?) {
        let input: String! = (hexString ?? "")
            .replacingOccurrences(of: "#", with: "")
            .uppercased()
        var alpha: CGFloat = 1.0
        var red: CGFloat = 0
        var blue: CGFloat = 0
        var green: CGFloat = 0
        switch (input.count) {
        case 3 /* #RGB */:
            red = Self.colorComponent(from: input, start: 0, length: 1)
            green = Self.colorComponent(from: input, start: 1, length: 1)
            blue = Self.colorComponent(from: input, start: 2, length: 1)
            break
        case 4 /* #ARGB */:
            alpha = Self.colorComponent(from: input, start: 0, length: 1)
            red = Self.colorComponent(from: input, start: 1, length: 1)
            green = Self.colorComponent(from: input, start: 2, length: 1)
            blue = Self.colorComponent(from: input, start: 3, length: 1)
            break
        case 6 /* #RRGGBB */:
            red = Self.colorComponent(from: input, start: 0, length: 2)
            green = Self.colorComponent(from: input, start: 2, length: 2)
            blue = Self.colorComponent(from: input, start: 4, length: 2)
            break
        case 8 /* #AARRGGBB */:
            alpha = Self.colorComponent(from: input, start: 0, length: 2)
            red = Self.colorComponent(from: input, start: 2, length: 2)
            green = Self.colorComponent(from: input, start: 4, length: 2)
            blue = Self.colorComponent(from: input, start: 6, length: 2)
            break
        default:
            NSException.raise(NSExceptionName("Invalid color value"), format: "Color value \"%@\" is invalid. It should be a hex value of the form #RGB, #ARGB, #RRGGBB, or #AARRGGBB", arguments: getVaList([hexString ?? ""]))
        }
        self.init(red: red, green: green, blue: blue, alpha: alpha)
    }
    
    static func colorComponent(from string: String!, start: Int, length: Int) -> CGFloat {
        let substring = (string as NSString)
            .substring(with: NSRange(location: start, length: length))
        let fullHex = length == 2 ? substring : "\(substring)\(substring)"
        var hexComponent: UInt64 = 0
        Scanner(string: fullHex)
            .scanHexInt64(&hexComponent)
        return CGFloat(Double(hexComponent) / 255.0)
    }
}

In colorComponentFrom:start:length:, shouldn't it be return hexComponent / 0xFF; // divide by 255, not 256? The largest component value you should get back is 0xFF, so that is what you should be dividing by.
Good catch Sam. Edited to reflect the change.
This is great, cheers. Also, instead of a category on UIColor you could make one on NSString to be able to have syntax like [@"#538aa4" toColor]
This solution is great, I would suggest to add "Private" for the name of the private interface to avoid a compiler warning. @interface UIColor(Private)
Nice. You should put the other function in the interface, though.
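
Picking up the NSString-category idea from an earlier comment: the Swift analogue would be a String extension. A rough sketch (the toColor() name mirrors that comment and is not part of this answer's code):

import UIKit

extension String {
    // Lets you write "#538aa4".toColor(); the parsing mirrors the
    // scanner-based answers elsewhere in this thread.
    func toColor() -> UIColor {
        var rgb: UInt64 = 0
        let scanner = Scanner(string: self)
        scanner.charactersToBeSkipped = CharacterSet(charactersIn: "#")
        scanner.scanHexInt64(&rgb)
        return UIColor(red: CGFloat((rgb & 0xFF0000) >> 16) / 255.0,
                       green: CGFloat((rgb & 0x00FF00) >> 8) / 255.0,
                       blue: CGFloat(rgb & 0x0000FF) / 255.0,
                       alpha: 1.0)
    }
}

// let teal = "#538aa4".toColor()
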
Tommie C.

There's a nice post on how to tackle the OP's question of extracting a UIColor from a hex string. The solution presented below is different from others because it supports string values that may include '0x' or '#' prefixed to the hex string representation... (see usage)

Here's the main bit...

- (UIColor *)getUIColorObjectFromHexString:(NSString *)hexStr alpha:(CGFloat)alpha
{
  // Convert hex string to an integer
  unsigned int hexint = [self intFromHexString:hexStr];
 
  // Create a color object, specifying alpha as well
  UIColor *color =
    [UIColor colorWithRed:((CGFloat) ((hexint & 0xFF0000) >> 16))/255
    green:((CGFloat) ((hexint & 0xFF00) >> 8))/255
    blue:((CGFloat) (hexint & 0xFF))/255
    alpha:alpha];
 
  return color;
}

Helper method...

- (unsigned int)intFromHexString:(NSString *)hexStr
{
  unsigned int hexInt = 0;
 
  // Create scanner
  NSScanner *scanner = [NSScanner scannerWithString:hexStr];
 
  // Tell scanner to skip the # character
  [scanner setCharactersToBeSkipped:[NSCharacterSet characterSetWithCharactersInString:@"#"]];
 
  // Scan hex value
  [scanner scanHexInt:&hexInt];
 
  return hexInt;
}

Usage:

NSString *hexStr1 = @"123ABC";
NSString *hexStr2 = @"#123ABC";
NSString *hexStr3 = @"0x123ABC";

UIColor *color1 = [self getUIColorObjectFromHexString:hexStr1 alpha:.9];
NSLog(@"UIColor: %@", color1);
 
UIColor *color2 = [self getUIColorObjectFromHexString:hexStr2 alpha:.9];
NSLog(@"UIColor: %@", color2);
 
UIColor *color3 = [self getUIColorObjectFromHexString:hexStr3 alpha:.9];
NSLog(@"UIColor: %@", color3);

Complete Reference Article

Swift 2+

I've ported this solution to Swift 2.2. Note that I've changed the alpha parameter to use a default set to 1.0. I've also updated the int type to UInt32 as required by the NSScanner class in Swift 2.2.

func colorWithHexString(hexString: String, alpha:CGFloat = 1.0) -> UIColor {
    
    // Convert hex string to an integer
    let hexint = Int(self.intFromHexString(hexString))
    let red = CGFloat((hexint & 0xff0000) >> 16) / 255.0
    let green = CGFloat((hexint & 0xff00) >> 8) / 255.0
    let blue = CGFloat((hexint & 0xff) >> 0) / 255.0 
    
    // Create color object, specifying alpha as well
    let color = UIColor(red: red, green: green, blue: blue, alpha: alpha)
    return color
}

func intFromHexString(hexStr: String) -> UInt32 {
    var hexInt: UInt32 = 0
    // Create scanner
    let scanner: NSScanner = NSScanner(string: hexStr)
    // Tell scanner to skip the # character
    scanner.charactersToBeSkipped = NSCharacterSet(charactersInString: "#")
    // Scan hex value
    scanner.scanHexInt(&hexInt)
    return hexInt
}

Swift 4+

Using the same logic, with changes applied for Swift 4:

func colorWithHexString(hexString: String, alpha:CGFloat = 1.0) -> UIColor {
    
    // Convert hex string to an integer
    let hexint = Int(self.intFromHexString(hexStr: hexString))
    let red = CGFloat((hexint & 0xff0000) >> 16) / 255.0
    let green = CGFloat((hexint & 0xff00) >> 8) / 255.0
    let blue = CGFloat((hexint & 0xff) >> 0) / 255.0
    
    // Create color object, specifying alpha as well
    let color = UIColor(red: red, green: green, blue: blue, alpha: alpha)
    return color
}

func intFromHexString(hexStr: String) -> UInt32 {
    var hexInt: UInt32 = 0
    // Create scanner
    let scanner: Scanner = Scanner(string: hexStr)
    // Tell scanner to skip the # character
    scanner.charactersToBeSkipped = CharacterSet(charactersIn: "#")
    // Scan hex value
    scanner.scanHexInt32(&hexInt)
    return hexInt
}

Swift 5 (iOS 13)+

The following shows an update that works given the SDK deprecation of scanHexInt32. I've wrapped the code into a Swift playground file.

//: A UIKit based Playground for presenting user interface
  
import UIKit
import PlaygroundSupport

class MyViewController : UIViewController {
    override func loadView() {
        let view = UIView()
        view.backgroundColor = .white

        let label = UILabel()
        label.frame = CGRect(x: 150, y: 200, width: 200, height: 20)
        label.text = "Hello World!"
        label.textColor = colorWithHexString(hexString: "22F728")
        
        view.addSubview(label)
        self.view = view
    }
    
    func colorWithHexString(hexString: String, alpha:CGFloat = 1.0) -> UIColor {

        // Convert hex string to an integer
        let hexint = Int(self.intFromHexString(hexStr: hexString))
        let red = CGFloat((hexint & 0xff0000) >> 16) / 255.0
        let green = CGFloat((hexint & 0xff00) >> 8) / 255.0
        let blue = CGFloat((hexint & 0xff) >> 0) / 255.0

        // Create color object, specifying alpha as well
        let color = UIColor(red: red, green: green, blue: blue, alpha: alpha)
        return color
    }

    func intFromHexString(hexStr: String) -> UInt32 {
        var hexInt: UInt32 = 0
        // Create scanner
        let scanner: Scanner = Scanner(string: hexStr)
        // Tell scanner to skip the # character
        scanner.charactersToBeSkipped = CharacterSet(charactersIn: "#")
        // Scan hex value
        hexInt = UInt32(bitPattern: scanner.scanInt32(representation: .hexadecimal) ?? 0)
        return hexInt
    }
}
// Present the view controller in the Live View window
PlaygroundPage.current.liveView = MyViewController()

Color hex references: HTML Color Names and Codes; Color Hex Color Codes.


The Swift snippets posted here seem to misunderstand the purpose of optionals in Swift, which is to contain values that may not exist. The question to ask when deciding whether a parameter needs to be an optional is whether someone may need the ability to set it to nil. Does it ever make sense for alpha to be set to nil? Because this method gives people that ability, and if someone decides to set alpha to nil, the forced unwrapping of that optional will invariably lead to a crash. I haven't edited it out, though, in case there's some justification of which I'm not aware.
@JonathanThornton - Thanks for the heads up. Resolved.
Working on a legacy project and the Objective-C solution works very well... except... oddly... @"#ffc107" and @"#e040fb" refuse to cooperate! Thoughts?
'scanHexInt32' was deprecated in macOS 10.15
@Sentry.co - Updated code to deal with deprecated method.
Ethan Strider

This is a function that takes a hex string and returns a UIColor.
(You can enter hex strings with either format: #ffffff or ffffff)

Usage:

var color1 = hexStringToUIColor("#d3d3d3")

Swift 4:

func hexStringToUIColor (hex:String) -> UIColor {
    var cString:String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()

    if (cString.hasPrefix("#")) {
        cString.remove(at: cString.startIndex)
    }

    if ((cString.count) != 6) {
        return UIColor.gray
    }

    var rgbValue:UInt32 = 0
    Scanner(string: cString).scanHexInt32(&rgbValue)

    return UIColor(
        red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
        green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
        blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
        alpha: CGFloat(1.0)
    )
}

Swift 3:

func hexStringToUIColor (hex:String) -> UIColor {
    var cString:String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()

    if (cString.hasPrefix("#")) {
        cString.remove(at: cString.startIndex)
    }

    if ((cString.characters.count) != 6) {
        return UIColor.gray
    }

    var rgbValue:UInt32 = 0
    Scanner(string: cString).scanHexInt32(&rgbValue)

    return UIColor(
        red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
        green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
        blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
        alpha: CGFloat(1.0)
    )
}

Swift 2:

func hexStringToUIColor (hex:String) -> UIColor {
    var cString:String = hex.stringByTrimmingCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet() as NSCharacterSet).uppercaseString

    if (cString.hasPrefix("#")) {
      cString = cString.substringFromIndex(cString.startIndex.advancedBy(1))
    }

    if ((cString.characters.count) != 6) {
      return UIColor.grayColor()
    }

    var rgbValue:UInt32 = 0
    NSScanner(string: cString).scanHexInt(&rgbValue)

    return UIColor(
        red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
        green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
        blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
        alpha: CGFloat(1.0)
    )
}

Source: arshad/gist:de147c42d7b3063ef7bc


I've seen in SVG that there is a short version of the hex string with 3 characters, like #F0F.
That is shorthand notation, where '#F0F' is equivalent to '#FF00FF'. It would be simple to write a function that checked for shorthand and expanded it.
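
For instance, here is a small Swift sketch of such an expansion helper (the expandShorthandHex name is made up for illustration):

// Expands 3-digit shorthand: "#F0F" -> "#FF00FF".
// Returns the input unchanged when it is not a 3-digit shorthand.
func expandShorthandHex(_ hex: String) -> String {
    let hasHash = hex.hasPrefix("#")
    let digits = hasHash ? String(hex.dropFirst()) : hex
    guard digits.count == 3 else { return hex }
    let expanded = digits.map { "\($0)\($0)" }.joined()   // duplicate each digit
    return (hasHash ? "#" : "") + expanded
}

// expandShorthandHex("#F0F") == "#FF00FF"
// hexStringToUIColor(hex: expandShorthandHex("#F0F"))   // then feed it to the function above
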
Stan James

Use this category:

in the file UIColor+Hexadecimal.h

#import <UIKit/UIKit.h>

@interface UIColor(Hexadecimal)

+ (UIColor *)colorWithHexString:(NSString *)hexString;

@end

in the file UIColor+Hexadecimal.m

#import "UIColor+Hexadecimal.h"

@implementation UIColor(Hexadecimal)

+ (UIColor *)colorWithHexString:(NSString *)hexString {
    unsigned rgbValue = 0;
    NSScanner *scanner = [NSScanner scannerWithString:hexString];
    [scanner setScanLocation:1]; // bypass '#' character
    [scanner scanHexInt:&rgbValue];

    return [UIColor colorWithRed:((rgbValue & 0xFF0000) >> 16)/255.0 green:((rgbValue & 0xFF00) >> 8)/255.0 blue:(rgbValue & 0xFF)/255.0 alpha:1.0];
}

@end

In the class where you want to use it:

#import "UIColor+Hexadecimal.h"

and:

[UIColor colorWithHexString:@"#6e4b4b"];

No alpha handling
aheze

You can make an extension like this:

extension UIColor {
    convenience init(hex: UInt, alpha: CGFloat = 1) {
        self.init(
            red: CGFloat((hex & 0xFF0000) >> 16) / 255.0,
            green: CGFloat((hex & 0x00FF00) >> 8) / 255.0,
            blue: CGFloat(hex & 0x0000FF) / 255.0,
            alpha: alpha
        )
    }
}

And use it anywhere like this

let color1 = UIColor(hex: 0xffffff)
let color2 = UIColor(hex: 0xffffff, alpha: 0.2)

Dead simple, definitely my favorite.
Thanks, this one is great, so concise compared to other walls of text!
nice function but it does not in any way answer the question.
@mah This is exactly the question, I think: how to create a UIColor from a hex string.
@ManuGupta you might notice a complete lack of any string handling in this answer. A hex number is not a hex string. The question explicitly states a string and though it doesn't have quotes in it, #00FF00 is clearly intended to be a character string. As others have said it's simple, concise. But if it doesn't deal with strings it can't possibly answer a question asking how to deal with strings.
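
To bridge that gap, the string can be parsed into a UInt first and then handed to the extension above. A minimal sketch (the helper name is made up here, not part of the answer):

import UIKit

// Parses a "#RRGGBB" string into a UInt, then reuses init(hex:alpha:) from this answer.
func color(fromHexString string: String, alpha: CGFloat = 1) -> UIColor {
    let digits = string.hasPrefix("#") ? String(string.dropFirst()) : string
    return UIColor(hex: UInt(digits, radix: 16) ?? 0, alpha: alpha)
}

// color(fromHexString: "#00FF00") is equivalent to UIColor(hex: 0x00FF00)
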
Community

A great Swift implementation (updated for Xcode 7) using extensions, pulled together from a variety of different answers and places. You will also need the string extensions at the end.

Use:

let hexColor = UIColor(hex: "#00FF00")

NOTE: I added an option for 2 additional digits to the end of the standard 6 digit hex value for an alpha channel (pass in value of 00-99). If this offends you, just remove it. You could implement it to pass in an optional alpha parameter.

Extension:

extension UIColor {

    convenience init(var hex: String) {
        var alpha: Float = 100
        let hexLength = hex.characters.count
        if !(hexLength == 7 || hexLength == 9) {
            // A hex must be either 7 or 9 characters (#RRGGBBAA)
            print("improper call to 'colorFromHex', hex length must be 7 or 9 chars (#GGRRBBAA)")
            self.init(white: 0, alpha: 1)
            return
        }

        if hexLength == 9 {
            // Note: this uses String subscripts as given below
            alpha = hex[7...8].floatValue
            hex = hex[0...6]
        }

        // Establishing the rgb color
        var rgb: UInt32 = 0
        let s: NSScanner = NSScanner(string: hex)
        // Setting the scan location to ignore the leading `#`
        s.scanLocation = 1
        // Scanning the int into the rgb colors
        s.scanHexInt(&rgb)

        // Creating the UIColor from hex int
        self.init(
            red: CGFloat((rgb & 0xFF0000) >> 16) / 255.0,
            green: CGFloat((rgb & 0x00FF00) >> 8) / 255.0,
            blue: CGFloat(rgb & 0x0000FF) / 255.0,
            alpha: CGFloat(alpha / 100)
        )
    }
}

String extensions:
Float source
Subscript source

extension String {

    /**
    Returns the float value of a string
    */
    var floatValue: Float {
        return (self as NSString).floatValue
    }

    /**
    Subscript to allow for quick String substrings ["Hello"][0...1] = "He"
    */
    subscript (r: Range<Int>) -> String {
        get {
            let start = self.startIndex.advancedBy(r.startIndex)
            let end = self.startIndex.advancedBy(r.endIndex - 1)
            return self.substringWithRange(start..<end)
        }
    }
}

Are you also using a String extension to get its subscript? eg stackoverflow.com/a/24144365/3220708
@CraigGrummitt oh my! Haha, yes. I have a decent compiled list of extensions and subscripts, so I sometimes (sadly) forget what is and is not included in the standard language feature set. I updated my answer, including the source you gave. Not sure if I even got it from there, but it sure looks close.
You might want to mention that it's a String extension. Also you seem to have missed String floatValue extension: stackoverflow.com/a/24088249/3220708 Other than that, good work!
Where is countElements() func?
countElements() was replaced with count() in Swift 1.2, it is built into the language. I updated my answer to reflect that.
A
Ashish Kakkad

There is no builtin conversion from a hexadecimal string to a UIColor (or CGColor) that I'm aware of. However, you can easily write a couple of functions for this purpose - for example, see iphone development accessing uicolor components


+1 If you scroll way down, the method in question is +colorWithHexString:.
@RobNapier +colorWithHexString: doesn't work. At least in my case. :)
Leo Dabus
extension UIColor {
    convenience init(hexaString: String, alpha: CGFloat = 1) {
        let chars = Array(hexaString.dropFirst())
        self.init(red:   .init(strtoul(String(chars[0...1]), nil, 16)) / 255,
                  green: .init(strtoul(String(chars[2...3]), nil, 16)) / 255,
                  blue:  .init(strtoul(String(chars[4...5]), nil, 16)) / 255,
                  alpha: alpha)
    }
}

Usage:

let redColor       = UIColor(hexaString: "#FF0000")              // r 1,0 g 0,0 b 0,0 a 1,0
let transparentRed = UIColor(hexaString: "#FF0000", alpha: 0.5)  // r 1,0 g 0,0 b 0,0 a 0,5

Another option is to convert the hex value to an unsigned integer and extract the corresponding components from it:

extension UIColor {
    convenience init(hexaString: String, alpha: CGFloat = 1) {
        self.init(hexa: UInt(hexaString.dropFirst(), radix: 16) ?? 0, alpha: alpha)
    }
    convenience init(hexa: UInt, alpha: CGFloat = 1) {
        self.init(red:   .init((hexa & 0xff0000) >> 16) / 255,
                  green: .init((hexa & 0xff00  ) >>  8) / 255,
                  blue:  .init( hexa & 0xff    )        / 255,
                  alpha: alpha)
    }
}

let purpleColor       = UIColor(hexaString: "#FF00FF")    // r 1,0 g 0,0 b 1,0 a 1,0
let transparentYellow = UIColor(hexaString: "#FFFF00", alpha: 0.5)  // r 1,0 g 1,0 b 0,0 a 0,5

best short solution ever
Ashish Kakkad

I found a good UIColor category for this, UIColor+PXExtensions.

Usage: UIColor *mycolor = [UIColor pxColorWithHexValue:@"#BADA55"];

And, just in case the link to my gist fails, here is the actual implementation code:

//
//  UIColor+PXExtensions.m
//

#import "UIColor+UIColor_PXExtensions.h"

@implementation UIColor (UIColor_PXExtensions)

+ (UIColor*)pxColorWithHexValue:(NSString*)hexValue
{
    //Default
    UIColor *defaultResult = [UIColor blackColor];

    //Strip prefixed # hash
    if ([hexValue hasPrefix:@"#"] && [hexValue length] > 1) {
        hexValue = [hexValue substringFromIndex:1];
    }

    //Determine if 3 or 6 digits
    NSUInteger componentLength = 0;
    if ([hexValue length] == 3)
    {
        componentLength = 1;
    }
    else if ([hexValue length] == 6)
    {
        componentLength = 2;
    }
    else
    {
        return defaultResult;
    }

    BOOL isValid = YES;
    CGFloat components[3];

    //Separate the R,G,B values
    for (NSUInteger i = 0; i < 3; i++) {
        NSString *component = [hexValue substringWithRange:NSMakeRange(componentLength * i, componentLength)];
        if (componentLength == 1) {
            component = [component stringByAppendingString:component];
        }
        NSScanner *scanner = [NSScanner scannerWithString:component];
        unsigned int value;
        isValid &= [scanner scanHexInt:&value];
        components[i] = (CGFloat)value / 255.0f; // divide by 255 (0xFF), not 256
    }

    if (!isValid) {
        return defaultResult;
    }

    return [UIColor colorWithRed:components[0]
                           green:components[1]
                            blue:components[2]
                           alpha:1.0];
}

@end

@PeterDuniho ok, I added the implementation code :)
Community

Swift version. Use it as a function or as an extension.

  func UIColorFromRGB(colorCode: String, alpha: Float = 1.0) -> UIColor{
    var scanner = NSScanner(string:colorCode)
    var color:UInt32 = 0;
    scanner.scanHexInt(&color)
    
    let mask = 0x000000FF
    let r = CGFloat(Float(Int(color >> 16) & mask)/255.0)
    let g = CGFloat(Float(Int(color >> 8) & mask)/255.0)
    let b = CGFloat(Float(Int(color) & mask)/255.0)
    
    return UIColor(red: r, green: g, blue: b, alpha: CGFloat(alpha))
}
extension UIColor {
    convenience init(colorCode: String, alpha: Float = 1.0){
        var scanner = NSScanner(string:colorCode)
        var color:UInt32 = 0;
        scanner.scanHexInt(&color)
        
        let mask = 0x000000FF
        let r = CGFloat(Float(Int(color >> 16) & mask)/255.0)
        let g = CGFloat(Float(Int(color >> 8) & mask)/255.0)
        let b = CGFloat(Float(Int(color) & mask)/255.0)
        
        self.init(red: r, green: g, blue: b, alpha: CGFloat(alpha))
    }
}
let hexColorFromFunction = UIColorFromRGB("F4C124", alpha: 1.0)
let hexColorFromExtension = UIColor(colorCode: "F4C124", alpha: 1.0)

Hex color reference (image): https://i.stack.imgur.com/Nlf4X.png


George Maisuradze

SWIFT 4

You can create a nice convenience constructor in the extension like this:

extension UIColor {
    convenience init(hexString: String, alpha: CGFloat = 1.0) {
        var hexInt: UInt32 = 0
        let scanner = Scanner(string: hexString)
        scanner.charactersToBeSkipped = CharacterSet(charactersIn: "#")
        scanner.scanHexInt32(&hexInt)

        let red = CGFloat((hexInt & 0xff0000) >> 16) / 255.0
        let green = CGFloat((hexInt & 0xff00) >> 8) / 255.0
        let blue = CGFloat((hexInt & 0xff) >> 0) / 255.0
        let alpha = alpha

        self.init(red: red, green: green, blue: blue, alpha: alpha)
    }
}

And use it later like

let color = UIColor(hexString: "#AABBCCDD")

'scanHexInt32' was deprecated in iOS 13.0
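
For reference, the same initializer can be written against the non-deprecated 64-bit scanner API; a sketch, with the rest of the logic unchanged:

import UIKit

extension UIColor {
    // Same parsing as above, but using scanHexInt64 instead of the
    // deprecated scanHexInt32.
    convenience init(hex string: String, alpha: CGFloat = 1.0) {
        var hexInt: UInt64 = 0
        let scanner = Scanner(string: string)
        scanner.charactersToBeSkipped = CharacterSet(charactersIn: "#")
        scanner.scanHexInt64(&hexInt)

        self.init(red: CGFloat((hexInt & 0xFF0000) >> 16) / 255.0,
                  green: CGFloat((hexInt & 0x00FF00) >> 8) / 255.0,
                  blue: CGFloat(hexInt & 0x0000FF) / 255.0,
                  alpha: alpha)
    }
}
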
Tieme

This is another alternative.

- (UIColor *)colorWithRGBHex:(UInt32)hex
{
    int r = (hex >> 16) & 0xFF;
    int g = (hex >> 8) & 0xFF;
    int b = (hex) & 0xFF;

    return [UIColor colorWithRed:r / 255.0f
                           green:g / 255.0f
                            blue:b / 255.0f
                           alpha:1.0f];
}

alternative answer to an alternative question... not a reasonable answer to the question on this page though.
Zorayr

You could use various online tools to convert a hex string to an actual UIColor. Check out uicolor.org or UI Color Picker. The output is Objective-C code, like:

[UIColor colorWithRed:0.93 green:0.80 blue:0.80 alpha:1.0];

Which you could embed in your application. Hope this helps!


Another online tool, same name in fact, UI Color Picker.
Generally when people ask for help with code to solve a fairly simple problem like this, an answer that says "first go to some online site..." is really not even close to being the answer the asker wanted.
Nath

This one is nice, with CocoaPods support:

https://github.com/mRs-/HexColors

// with hash
NSColor *colorWithHex = [NSColor colorWithHexString:@"#ff8942" alpha:1];

// without hash
NSColor *secondColorWithHex = [NSColor colorWithHexString:@"ff8942" alpha:1];

// short handling
NSColor *shortColorWithHex = [NSColor colorWithHexString:@"fff" alpha:1];

Beautiful stuff bro :)
gheese

Another version with alpha

#define UIColorFromRGBA(rgbValue) [UIColor colorWithRed:((float)((rgbValue & 0xFF000000) >> 24))/255.0 green:((float)((rgbValue & 0xFF0000) >> 16))/255.0 blue:((float)((rgbValue & 0xFF00) >> 8 ))/255.0 alpha:((float)((rgbValue & 0xFF))/255.0)]

hris.to

Swift equivalent of @Tom's answer, although it receives an RGBA Int value to support transparency:

func colorWithHex(aHex: UInt) -> UIColor
{
    return UIColor(red: CGFloat((aHex & 0xFF000000) >> 24) / 255,
        green: CGFloat((aHex & 0x00FF0000) >> 16) / 255,
        blue: CGFloat((aHex & 0x0000FF00) >> 8) / 255,
        alpha: CGFloat((aHex & 0x000000FF) >> 0) / 255)
}

//usage
var color = colorWithHex(0x7F00FFFF)

And if you want to be able to use it from string you could use strtoul:

var hexString = "0x7F00FFFF"

let num = strtoul(hexString, nil, 16)

var colorFromString = colorWithHex(num)

Morgan Wilde

Here's a Swift 1.2 version written as an extension to UIColor. This allows you to do

let redColor = UIColor(hex: "#FF0000")

Which I feel is the most natural way of doing it.

extension UIColor {
  // Initialiser for strings of format '#_RED_GREEN_BLUE_'
  convenience init(hex: String) {
    let redRange    = Range<String.Index>(start: hex.startIndex.advancedBy(1), end: hex.startIndex.advancedBy(3))
    let greenRange  = Range<String.Index>(start: hex.startIndex.advancedBy(3), end: hex.startIndex.advancedBy(5))
    let blueRange   = Range<String.Index>(start: hex.startIndex.advancedBy(5), end: hex.startIndex.advancedBy(7))

    var red     : UInt32 = 0
    var green   : UInt32 = 0
    var blue    : UInt32 = 0

    NSScanner(string: hex.substringWithRange(redRange)).scanHexInt(&red)
    NSScanner(string: hex.substringWithRange(greenRange)).scanHexInt(&green)
    NSScanner(string: hex.substringWithRange(blueRange)).scanHexInt(&blue)

    self.init(
      red: CGFloat(red) / 255,
      green: CGFloat(green) / 255,
      blue: CGFloat(blue) / 255,
      alpha: 1
    )
  }
}

In Xcode 6.3.2 on the line that starts with let greenRange = ... I get an exception: fatal error: can not increment endIndex
@CliftonLabrum I've tested this on Xcode 7 beta 3, and it works the same. Are you still having this issue?
David Kyslenko

Swift 5, iOS 14

convenience init(hex: String, alpha: CGFloat = 1.0) {
    var hexFormatted: String = hex.trimmingCharacters(in: CharacterSet.whitespacesAndNewlines).uppercased()
    
    if hexFormatted.hasPrefix("#") {
        hexFormatted = String(hexFormatted.dropFirst())
    }
    
    assert(hexFormatted.count == 6, "Invalid hex code used.")
    
    var rgbValue: UInt64 = 0
    Scanner(string: hexFormatted).scanHexInt64(&rgbValue)
    
    self.init(red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
              green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
              blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
              alpha: alpha)
}

Renato Lochetti

Another implementation allowing strings like "FFF" or "FFFFFF" and using alpha:

+ (UIColor *) colorFromHexString:(NSString *)hexString alpha: (CGFloat)alpha{
    NSString *cleanString = [hexString stringByReplacingOccurrencesOfString:@"#" withString:@""];
    if([cleanString length] == 3) {
        cleanString = [NSString stringWithFormat:@"%@%@%@%@%@%@",
                       [cleanString substringWithRange:NSMakeRange(0, 1)],[cleanString substringWithRange:NSMakeRange(0, 1)],
                       [cleanString substringWithRange:NSMakeRange(1, 1)],[cleanString substringWithRange:NSMakeRange(1, 1)],
                       [cleanString substringWithRange:NSMakeRange(2, 1)],[cleanString substringWithRange:NSMakeRange(2, 1)]];
    }
    if([cleanString length] == 6) {
        cleanString = [cleanString stringByAppendingString:@"ff"];
    }

    unsigned int baseValue;
    [[NSScanner scannerWithString:cleanString] scanHexInt:&baseValue];

    float red = ((baseValue >> 24) & 0xFF)/255.0f;
    float green = ((baseValue >> 16) & 0xFF)/255.0f;
    float blue = ((baseValue >> 8) & 0xFF)/255.0f;

    return [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
}

RiceAndBytes

Updated for Swift 1.2:

class func colorWithHexString (hex:String) -> UIColor {
    var cString: NSString = hex.stringByTrimmingCharactersInSet(NSCharacterSet.whitespaceAndNewlineCharacterSet()).uppercaseString

    if (cString.hasPrefix("#")) {
        cString = cString.substringFromIndex(1)
    }

    if (count(cString as String) != 6) {
        return UIColor.grayColor()
    }

    var rString: String = cString.substringToIndex(2)
    var gString: String = (cString.substringFromIndex(2) as NSString).substringToIndex(2)
    var bString: String = (cString.substringFromIndex(4) as NSString).substringToIndex(2)

    var r:CUnsignedInt = 0, g:CUnsignedInt = 0, b:CUnsignedInt = 0;
    NSScanner(string: rString).scanHexInt(&r)
    NSScanner(string: gString).scanHexInt(&g)
    NSScanner(string: bString).scanHexInt(&b)
    return UIColor(red: CGFloat(Float(r) / 255.0), green: CGFloat(Float(g) / 255.0), blue: CGFloat(Float(b) / 255.0), alpha: CGFloat(1))

}

Bartłomiej Semańczyk

Create an elegant extension for UIColor:

extension UIColor {

    convenience init(string: String) {

        var uppercasedString = string.uppercased()
        uppercasedString.remove(at: uppercasedString.startIndex) // drop the leading '#'

        var rgbValue: UInt32 = 0
        Scanner(string: uppercasedString).scanHexInt32(&rgbValue)

        let red = CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0
        let green = CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0
        let blue = CGFloat(rgbValue & 0x0000FF) / 255.0

        self.init(red: red, green: green, blue: blue, alpha: 1)
    }
}

Create red color:

let red = UIColor(string: "#ff0000") 

Vinayak
extension UIColor 
{
    class func fromHexaString(hex:String) -> UIColor
    {
        let scanner           = Scanner(string: hex)
        scanner.scanLocation  = 0
        var rgbValue: UInt64  = 0
        scanner.scanHexInt64(&rgbValue)

        return UIColor(
            red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
            green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
            blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
            alpha: CGFloat(1.0)
        )
    }
}

// You can call it like this:

UIColor.fromHexaString(hex: "3276b1")

MLQ

I ended up creating a category for UIColor that I can just reuse in my other projects, and added this function:

+ (UIColor *)colorFromHex:(unsigned long)hex
{
    return [UIColor colorWithRed:((float)((hex & 0xFF0000) >> 16))/255.0
                           green:((float)((hex & 0xFF00) >> 8))/255.0
                            blue:((float)(hex & 0xFF))/255.0
                           alpha:1.0];
}

The usage goes like:

UIColor *customRedColor = [UIColor colorFromHex:0x990000];

This is far faster than passing in a string, converting it to a number, and then shifting the bits.

You can also import the category from inside your .pch file so you can easily use colorFromHex everywhere in your app like it's built-in to UIColor:

#ifdef __OBJC__
    #import <UIKit/UIKit.h>
    #import <Foundation/Foundation.h>
    // Your other stuff here...
    #import "UIColor+HexColor.h"
#endif

At first I liked and tried the #define approach above. Yet, like most defines with many parentheses, it was hard to extend and debug. I then fell back to the "Utilities" class method approach. This works, but it introduces a new class name into the namespace. Then I saw your posting, and I like it a lot because it understands how to use the Objective-C language. Good show. I plan on making a similar solution that takes RGB decimal values (e.g. red: 24 green: 104 blue: 255).
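
For reference, such a decimal-component variant could look like this in Swift (a sketch; the initializer name and labels are illustrative):

import UIKit

// Takes 0-255 integer components instead of a packed hex value.
extension UIColor {
    convenience init(r: Int, g: Int, b: Int, alpha: CGFloat = 1.0) {
        self.init(red: CGFloat(r) / 255.0,
                  green: CGFloat(g) / 255.0,
                  blue: CGFloat(b) / 255.0,
                  alpha: alpha)
    }
}

// let accent = UIColor(r: 24, g: 104, b: 255)
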
steveSarsawa
self.view.backgroundColor = colorWithHex(hex: yourColorCode)

Code for creating a UIColor from a hexadecimal code:

func colorWithHex (hex:String) -> UIColor {
    var cString:String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased()

    if (cString.hasPrefix("#")) {
        cString.remove(at: cString.startIndex)
    }

    if ((cString.count) != 6) {
        return UIColor.gray
    }

    var rgbValue:UInt32 = 0
    Scanner(string: cString).scanHexInt32(&rgbValue)

    return UIColor(
        red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
        green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
        blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
        alpha: CGFloat(1.0)
    )
}

Manish Saini
You can get a UIColor from a hex string like this:

    circularSpinner.fillColor = [self getUIColorObjectFromHexString:@"27b8c8" alpha:1.0];

 //Function For Hex Color Use
    - (unsigned int)intFromHexString:(NSString *)hexStr
    {
        unsigned int hexInt = 0;

        // Create scanner
        NSScanner *scanner = [NSScanner scannerWithString:hexStr];

        // Tell scanner to skip the # character
        [scanner setCharactersToBeSkipped:[NSCharacterSet characterSetWithCharactersInString:@"#"]];

        // Scan hex value
        [scanner scanHexInt:&hexInt];

        return hexInt;
    }




    - (UIColor *)getUIColorObjectFromHexString:(NSString *)hexStr alpha:(CGFloat)alpha
    {
        // Convert hex string to an integer
        unsigned int hexint = [self intFromHexString:hexStr];

        // Create color object, specifying alpha as well
        UIColor *color =
        [UIColor colorWithRed:((CGFloat) ((hexint & 0xFF0000) >> 16))/255
                        green:((CGFloat) ((hexint & 0xFF00) >> 8))/255
                         blue:((CGFloat) (hexint & 0xFF))/255
                        alpha:alpha];

        return color;
    }

If you want to use this code, just call [self getUIColorObjectFromHexString:@"27b8c8" alpha:1.0];
adali

I like to set the alpha alongside the color, so I wrote my own category:

+ (UIColor *) colorWithHex:(int)color {

    float red = (color & 0xff000000) >> 24;
    float green = (color & 0x00ff0000) >> 16;
    float blue = (color & 0x0000ff00) >> 8;
    float alpha = (color & 0x000000ff);

    return [UIColor colorWithRed:red/255.0 green:green/255.0 blue:blue/255.0 alpha:alpha/255.0];
}

Easy to use, like this:

[UIColor colorWithHex:0xFF0000FF]; //Red
[UIColor colorWithHex:0x00FF00FF]; //Green
[UIColor colorWithHex:0x0000FFFF]; //Blue
[UIColor colorWithHex:0x0000007F]; //transparent black

Lucas Martins Juviniano

I created a convenience init for that:

extension UIColor {
    convenience init(hex: String, alpha: CGFloat) {
        let redH = CGFloat(strtoul(hex.substringToIndex(advance(hex.startIndex, 2)), nil, 16))
        let greenH = CGFloat(strtoul(hex.substringWithRange(Range<String.Index>(start: advance(hex.startIndex, 2), end: advance(hex.startIndex, 4))), nil, 16))
        let blueH = CGFloat(strtoul(hex.substringFromIndex(advance(hex.startIndex, 4)), nil, 16))

        self.init(red: redH/255, green: greenH/255, blue: blueH/255, alpha: alpha)
    }
}

Then you can create a UIColor anywhere in your project, just like this:

UIColor(hex: "ffe3c8", alpha: 1)

hope this helps...


Abhishek Verma

You can create an extension of UIColor like this:

extension UIColor {

    // MARK: - getColorFromHex

    /**
     Converts a color hex code string (e.g. "#FF0000") to a UIColor.

     - parameter hexString: the color hex string.

     - returns: the corresponding UIColor.
     */
    class func getColorFromHex(hexString: String) -> UIColor {

        var rgbValue: UInt32 = 0
        let scanner: NSScanner = NSScanner(string: hexString)

        scanner.scanLocation = 1
        scanner.scanHexInt(&rgbValue)

        return UIColor(red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0, green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0, blue: CGFloat(rgbValue & 0x0000FF) / 255.0, alpha: CGFloat(1.0))
    }
}